Progressive GPU Lightmapper (Preview)

Note: This is a preview feature and is subject to change. Any Scenes that use this feature may need updating in a future release. Do not rely on this feature for full-scale production until it is officially released.

The GPU Lightmapper gives you an interactive workflow when you’re setting up and tweaking the lighting in your scene. Because this lightmapper uses the GPU in your computer to generate baked lightmaps and Light probes, it’s a faster alternative to the CPU Progressive Lightmapper. Sampling and noise patterns look slightly different than those produced by the CPU lightmapper, because the sampling algorithm is different.

Hardware and software requirements

To use the Progressive GPU Lightmapper, your computer must have:

  • A Windows operating system
  • At least one GPU with OpenCL 1.2 support
  • At least 2 GB of dedicated GPU memory
  • A CPU that supports SSE4.1 instructions

If your computer has more than one GPU, Unity automatically selects one for rendering and a different one for light baking. To change this, see Configuring which GPU to use below.

Note: If the baking process uses more than the available GPU memory, the process can fall back to the CPU Lightmapper. Some drivers with virtual memory support swap to CPU memory instead, which makes the baking process slower.

Selecting the Progressive GPU Lightmapper

To select the Progressive GPU Lightmapper in the Unity Editor: In your Project, go to Window > Rendering > Lighting Settings. Under Lightmapping Settings, find the Lightmapper property, and select Progressive GPU (Preview) in the drop-down menu.

Select the Progressive GPU Lightmapper under Lightmapper in Lightmapping Settings.
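
If you prefer to configure this from an Editor script, a minimal sketch using the LightingSettings scripting API (available from Unity 2020.1) could look like the following; the menu path and class name are illustrative:

using UnityEditor;
using UnityEngine;

public static class SelectGpuLightmapper
{
    [MenuItem("Tools/Use Progressive GPU Lightmapper")]
    static void UseProgressiveGpu()
    {
        // The LightingSettings asset assigned to the active Scene;
        // this property throws if no asset is assigned.
        LightingSettings settings = Lightmapping.lightingSettings;
        settings.lightmapper = LightingSettings.Lightmapper.ProgressiveGPU;
        Debug.Log("Lightmapper set to Progressive GPU (Preview).");
    }
}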

Configuring which GPU to use

If your computer has more than one GPU, Unity automatically uses one GPU for rendering the Scene and the other GPU for baking lightmaps. If the GPU assignments don’t fit your needs, you can specify which graphics card to use for baking.

To see which GPU Unity currently uses for baking:

  • In the Editor, open the Lighting window. The GPU Unity uses for baking appears next to Bake Performance.

To see the available GPUs in your machine:

  1. Make sure you’ve selected the Progressive GPU Lightmapper, as described above.
  2. Generate the lighting in your Scene.
  3. Open File Explorer, and navigate to the following path: C:\Users\USER\AppData\Local\Unity\Editor.
  4. Open the file called Editor.log.
  5. In the file, search for the line Listing OpenCL platforms. This should jump to the part of the log with information about OpenCL devices. Here, you can see your available GPUs along with their corresponding platform and device indexes.
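
If you’d rather not search the log by hand, a small Editor sketch like the following prints the same section to the Console. It assumes the default Windows log location from step 3; the class and menu path are illustrative:

using System;
using System.IO;
using UnityEditor;
using UnityEngine;

public static class ListOpenCLDevices
{
    [MenuItem("Tools/Print OpenCL Devices From Editor.log")]
    static void Print()
    {
        // Default Windows location: %LOCALAPPDATA%\Unity\Editor\Editor.log.
        string log = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "Unity", "Editor", "Editor.log");

        // Open with ReadWrite sharing because the running Editor keeps the file open.
        string[] lines;
        using (var stream = new FileStream(log, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var reader = new StreamReader(stream))
            lines = reader.ReadToEnd().Split('\n');

        int start = Array.FindIndex(lines, l => l.Contains("Listing OpenCL platforms"));
        if (start < 0)
        {
            Debug.Log("No OpenCL section found - generate lighting first.");
            return;
        }
        // Print the marker line plus the device entries that follow it.
        for (int i = start; i < Math.Min(start + 30, lines.Length); i++)
            Debug.Log(lines[i]);
    }
}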

To select a specific GPU for baking:

  • At the command line, enter the following command, replacing <platform index> and <device index> with the relevant numbers:

Unity.exe -OpenCL-PlatformAndDeviceIndices <platform index> <device index>
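
For example, assuming the log lists your preferred baking GPU at platform index 0 and device index 1, a full launch command might look like this (the install and project paths here are hypothetical; -projectPath is the standard flag for opening a specific project):

"C:\Program Files\Unity\Editor\Unity.exe" -projectPath "C:\Projects\MyGame" -OpenCL-PlatformAndDeviceIndices 0 1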

Which GPU you assign to each task depends on your needs while you work on the Scene: assigning the strongest GPU to one activity incurs a cost on the other. If you encounter issues, try swapping the assignments.

This page doesn’t contain information about console platforms. For information about console platforms, see the platform-specific documentation.

Texture formats by platform

For an overview of texture formats, see Texture formats. For information about texture import settings and how to set per-texture platform-specific overrides, see Texture import settings. Some texture import settings can also be overridden globally in Build Settings, mostly to speed up iteration time during development.

Terminology

This page uses the following terminology:

  • Bits per pixel (bpp) is the amount of storage required for a single texture pixel. Textures with a lower bpp value have a smaller size on disk and in memory. A lower bpp value also means that the GPU can store more pixels in its cache, which results in faster texture access.
  • LDR (Low Dynamic Range) refers to most typical images where colors are conceptually between 0.0 (black) and 1.0 (white) values. The majority of image files (such as PNG and JPG) have low dynamic range.
  • HDR (High Dynamic Range) refers to special image and texture formats where colors can have a higher range than 0 through 1. Image file formats like .exr or .hdr are often used for HDR image data. At runtime and on the GPU, there are several HDR formats, trading off accuracy, range, and memory usage.
  • RGB is a color model in which red, green and blue combine to reproduce an array of colors.
  • RGBA is a version of RGB with an alpha channel, which supports blending and opacity alteration.
  • Variable bit rate (VBR) means that bits per pixel is not a fixed value but depends on the actual content. VBR applies only to Crunch compression, and affects only the texture size on disk; the size in memory is the same as when using the underlying texture format (for example, RGB Compressed DXT1 for RGB Crunched DXT1).
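
To make bpp concrete: a 1024×1024 RGB texture compressed as DXT1 (four bits/pixel) occupies 1024 × 1024 × 4 / 8 = 524,288 bytes (0.5 MB), while the same texture stored as uncompressed 32-bit RGBA occupies 4 MB, an eightfold difference on disk, in memory, and in cache footprint.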

Desktop

For devices with DirectX 11 or better class GPUs, where support for the BC7 and BC6H formats is guaranteed to be available, the recommended compression formats are:

  • RGB textures: DXT1 at four bits/pixel.
  • RGBA textures: BC7 (higher quality, slower to compress) or DXT5 (faster to compress), both at eight bits/pixel.
  • HDR textures: BC6H at eight bits/pixel.

If you need to support DirectX 10 class GPUs on PC (NVIDIA GPUs before 2010, AMD before 2009, Intel before 2012), prefer DXT5 over BC7, because these GPUs support neither BC7 nor BC6H.
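
To apply these recommendations per texture from a script, an Editor sketch along the following lines sets a Standalone (desktop) override through the TextureImporter API; the menu path is illustrative, and the format follows the RGBA recommendation above:

using UnityEditor;
using UnityEngine;

public static class DesktopTextureOverride
{
    [MenuItem("Tools/Set BC7 Override On Selected Texture")]
    static void SetBc7()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null)
        {
            Debug.Log("Select a texture asset first.");
            return;
        }

        importer.SetPlatformTextureSettings(new TextureImporterPlatformSettings
        {
            name = "Standalone",                // desktop platform group
            overridden = true,
            format = TextureImporterFormat.BC7, // RGBA, eight bits/pixel
        });
        importer.SaveAndReimport();
    }
}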

See the texture format reference documentation for detailed information about all supported formats.

iOS and tvOS

For Apple devices that use the A8 chip (2014) or later, ASTC is the recommended texture format for RGB and RGBA textures. This format lets you trade texture quality against size at a granular level: all the way from eight bits/pixel (4x4 block size) down to 0.89 bits/pixel (12x12 block size). If you need to support older devices, or you want additional Crunch compression, Apple devices support the ETC/ETC2 formats starting with the A7 chip (2013). For even older devices, PVRTC is the format to use; it gives you the broadest possible compatibility. On iOS you can configure the default texture format in the Player Settings. ASTC is preferred, but it is not supported on A7 devices (the very first Metal-enabled devices), where it is unpacked at runtime.
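
Following the pattern of the desktop sketch above (and reusing its imports), a per-texture iOS override could select an ASTC block size explicitly. This is a sketch: the helper name is hypothetical, "iPhone" is the platform string the importer commonly uses for iOS, and TextureImporterFormat.ASTC_6x6 assumes a recent Unity version:

static void SetIosAstc(TextureImporter importer)
{
    importer.SetPlatformTextureSettings(new TextureImporterPlatformSettings
    {
        name = "iPhone",     // importer platform string for iOS (assumption)
        overridden = true,
        // 6x6 blocks: roughly 3.6 bits/pixel, between 4x4 (8 bpp) and 12x12 (0.89 bpp).
        format = TextureImporterFormat.ASTC_6x6,
    });
    importer.SaveAndReimport();
}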

See the texture format reference documentation for detailed information about all supported formats.

Android

Texture format support on Android is complicated. You might need to build several application versions with different sub-targets.

You can select the default format in the Build Settings. Your options are ASTC, ETC2, and ETC (ETC1 for RGB, ETC2 for RGBA). See the documentation on texture compression settings for more details on how the different settings interact.
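
If you script your builds, the same default can be set through the EditorUserBuildSettings API; a minimal sketch (the menu path is illustrative):

using UnityEditor;

public static class AndroidTextureDefaults
{
    [MenuItem("Tools/Android/Default Texture Compression To ASTC")]
    static void DefaultToAstc()
    {
        // Mirrors the texture compression drop-down for the Android build target.
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;
    }
}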

For LDR RGB and RGBA textures, most modern Android GPUs that support OpenGL ES 3.1 or Vulkan also support ASTC format, including: Qualcomm GPUs since Adreno 4xx / Snapdragon 415 (2015), ARM GPUs since Mali T624 (2012), NVIDIA GPUs since Tegra K1 (2014), PowerVR GPUs since GX6250 (2014).

If you need support for older devices, or you want additional Crunch compression, all GPUs that run Vulkan or OpenGL ES 3.0 support the ETC2 format. The resulting image quality is quite high, and the format supports one- to four-component texture data. OpenGL ES 2 devices do not support ETC2, so Unity decompresses the texture at runtime to the format specified by the ETC2 fallback setting. For even older devices, usually only the ETC format is available; its drawback is that it has no direct alpha channel support. For Sprites, Unity offers an option to use ETC1 compression by splitting a texture into two ETC1 textures: one for RGB, one for alpha. To enable this, enable the Android-specific option for the Texture when importing a Sprite Atlas. The sprite shader samples both textures and combines them into the final result.

For HDR textures, ASTC HDR is the only compressed format available on Android devices; it requires Vulkan or GL_KHR_texture_compression_astc_hdr support, and is the most flexible of the formats. If a device doesn’t support ASTC HDR, the texture is decompressed at runtime to RGB9e5 or RGBA Half, depending on alpha channel usage.

As a fallback for devices that don’t support ASTC HDR, all devices running Vulkan, Metal, or OpenGL ES 3.0 support RGB9e5, which is suitable for textures without an alpha channel. If you need an alpha channel, or an even wider range, use RGBA Half, which takes twice as much memory as RGB9e5.
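
As a concrete comparison: RGB9e5 packs three 9-bit mantissas and a shared 5-bit exponent into 32 bits per pixel, while RGBA Half stores four 16-bit half-floats at 64 bits per pixel, so a 1024×1024 HDR texture costs roughly 4 MB as RGB9e5 versus 8 MB as RGBA Half.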