Note: This is a preview feature and is subject to change. Any Scenes that use this feature may need updating in a future release. Do not rely on this feature for full-scale production until it is officially released.

The GPU Lightmapper gives you an interactive workflow when you’re setting up and tweaking the lighting in your Scene. Because this lightmapper uses the GPU in your computer to generate baked lightmaps and Light Probes, it’s a faster alternative to the CPU Progressive Lightmapper. Sampling and noise patterns look slightly different from those the CPU lightmapper produces, because the sampling algorithm is different.

Hardware and software requirements

To use the Progressive GPU Lightmapper, your computer must have:
If your computer has more than one GPU, Unity automatically selects one for rendering and a different one for light baking. To change this, see Configuring which GPU to use, below.

Note: If the baking process uses more than the available GPU memory, the process can fall back to the CPU lightmapper. Some drivers with virtual memory support swap to CPU memory instead, which makes the baking process slower.

Selecting the Progressive GPU Lightmapper

To select the Progressive GPU Lightmapper in the Unity Editor:

1. In your Project, go to Window > Rendering > Lighting Settings.
2. Under Lightmapping Settings, find the Lightmapper property, and select Progressive GPU (Preview) from the drop-down menu.

(Image: the Progressive GPU Lightmapper selected under Lightmapper in Lightmapping Settings.)

Configuring which GPU to use

If your computer has more than one GPU, Unity automatically uses one GPU to render the Scene and the other GPU to bake lightmaps. If the GPU assignments don’t fit your needs, you can specify which graphics card to use for baking. To see which GPU Unity currently uses for baking:
To see the available GPUs in your machine:
To select a specific GPU for baking:
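The baking device can also be pinned when launching the Editor from the command line. A minimal sketch, assuming the `-OpenCL-PlatformAndDeviceIndices` command-line argument (check the command-line reference for your Editor version); the paths and indices below are placeholders:

```shell
# Launch the Editor with the GPU lightmapper forced onto a specific OpenCL
# platform/device pair. The two indices match the order in which the devices
# are listed in Editor.log when baking starts; "0 0" here is only an example.
/path/to/Unity -projectPath /path/to/MyProject -OpenCL-PlatformAndDeviceIndices 0 0
```

This is useful on build machines, where no one is available to change the assignment in the Editor preferences.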
Your choice of assignment should depend on your needs while you work on the Scene. Assigning the strongest GPU to one activity can slow down the other, so if you encounter issues, try re-assigning the GPUs.

This page doesn’t contain information about console platforms. For information about console platforms, see the platform-specific documentation. For an overview of texture formats, see Texture formats. For information about texture import settings and how to set per-texture platform-specific overrides, see Texture import settings. You can also override some texture import settings globally in Build Settings, mostly to speed up iteration time during development.

Terminology

This page uses the following terminology:
Recommended texture formats, by platform

Desktop

For devices with DirectX 11 or better class GPUs, where support for the BC7 and BC6H formats is guaranteed, the recommended compression formats are:

- RGB textures: DXT1, at four bits/pixel.
- RGBA textures: BC7 (higher quality, slower to compress) or DXT5 (faster to compress), both at eight bits/pixel.
- HDR textures: BC6H, at eight bits/pixel.

If you need to support DirectX 10 class GPUs on PC (NVIDIA GPUs before 2010, AMD before 2009, Intel before 2012), prefer DXT5 over BC7, because these GPUs support neither BC7 nor BC6H. See Texture formats for detailed information about all supported formats.

iOS and tvOS

For Apple devices that use the A8 chip (2014) or above, ASTC is the recommended texture format for RGB and RGBA textures. This format lets you trade texture quality against size at a granular level: from eight bits/pixel (4x4 block size) down to 0.89 bits/pixel (12x12 block size). If you need support for older devices, or you want additional Crunch compression, Apple devices support the ETC/ETC2 formats starting with the A7 chip (2013). For even older devices, PVRTC is the format to use; it gives you the broadest possible compatibility. On iOS you can configure the default texture format in the Player Settings. ASTC is preferred, but it is not supported on A7 devices (the very first Metal-enabled devices) and will be unpacked at runtime there. See Texture formats for detailed information about all supported formats.

Android

Texture format support on Android is complicated, and you might need to build several application versions with different sub-targets. You can select the default format in the Build Settings. Your options are ASTC, ETC2, and ETC (ETC1 for RGB, ETC2 for RGBA). See Texture import settings for more details on how the different settings interact.
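The bits-per-pixel figures above translate directly into texture memory sizes, because all the DXT/BC formats compress fixed 4x4-pixel blocks. A quick sketch (the block sizes are standard BC-format values, not Unity-specific):

```python
# Block sizes for the BC/DXT family (each block covers 4x4 pixels):
#   DXT1 (BC1):            8 bytes per block -> 4 bits/pixel
#   DXT5 (BC3), BC7, BC6H: 16 bytes per block -> 8 bits/pixel
def compressed_size(width, height, bytes_per_block, block=4):
    """Size in bytes of one compressed mip level (no mip chain)."""
    blocks_x = (width + block - 1) // block   # round up partial blocks
    blocks_y = (height + block - 1) // block
    return blocks_x * blocks_y * bytes_per_block

# A 1024x1024 texture:
print(compressed_size(1024, 1024, 8))   # DXT1: 524288 bytes (512 KiB)
print(compressed_size(1024, 1024, 16))  # BC7/DXT5/BC6H: 1048576 bytes (1 MiB)
```

This is why BC7 and DXT5 cost the same amount of memory: the quality difference comes from how each block is encoded, not from its size.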
For LDR RGB and RGBA textures, most modern Android GPUs that support OpenGL ES 3.1 or Vulkan also support the ASTC format, including:

- Qualcomm GPUs since Adreno 4xx / Snapdragon 415 (2015)
- ARM GPUs since Mali T624 (2012)
- NVIDIA GPUs since Tegra K1 (2014)
- PowerVR GPUs since GX6250 (2014)

If you need support for older devices, or you want additional Crunch compression, all GPUs that run Vulkan or OpenGL ES 3.0 support the ETC2 format. The resulting image quality is quite high, and the format supports one- to four-component texture data. OpenGL ES 2 devices don’t support ETC2, so Unity decompresses the texture at runtime to the format specified by the ETC2 fallback setting. For even older devices, usually only the ETC format is available; its drawback is that it has no direct alpha channel support. For Sprites, Unity offers an option to use ETC1 compression by splitting a texture into two ETC1 textures: one for RGB and one for alpha. To enable this, enable the Android-specific option for the Texture when importing a Sprite Atlas. The sprite shader samples both textures and combines them into the final result.

For HDR textures, ASTC HDR is the only compressed format available on Android devices. ASTC HDR requires Vulkan or GL_KHR_texture_compression_astc_hdr support. ASTC is the most flexible format. If a device doesn’t support ASTC HDR, the texture is decompressed at runtime to RGB9e5 or RGBA Half, depending on alpha channel usage.
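ASTC’s flexibility comes from its fixed 128-bit (16-byte) block that can cover a variable pixel footprint, which is where the bits-per-pixel range quoted above comes from:

```python
# ASTC always stores 128 bits per block; only the block's pixel
# footprint changes, so bits/pixel = 128 / (block width * block height).
def astc_bits_per_pixel(block_w, block_h):
    return 128 / (block_w * block_h)

print(astc_bits_per_pixel(4, 4))             # 8.0 bits/pixel (highest quality)
print(round(astc_bits_per_pixel(6, 6), 2))   # 3.56 bits/pixel
print(round(astc_bits_per_pixel(12, 12), 2)) # 0.89 bits/pixel (smallest)
```

Larger block footprints mean more pixels share the same 128 bits, so quality drops as the texture shrinks.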
For devices that don’t support ASTC HDR: all devices running Vulkan, Metal, or OpenGL ES 3.0 support RGB9e5, which is suitable for textures without an alpha channel. If you need an alpha channel, or even wider support, use RGBA Half, which takes twice as much memory as RGB9e5.
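The “twice as much memory” comparison follows directly from the per-pixel sizes: RGB9e5 packs three 9-bit mantissas plus a shared 5-bit exponent into 32 bits, while RGBA Half stores four 16-bit half-precision floats:

```python
RGB9E5_BITS = 9 * 3 + 5   # 32 bits/pixel, no alpha channel
RGBA_HALF_BITS = 16 * 4   # 64 bits/pixel, includes alpha

def uncompressed_bytes(width, height, bits_per_pixel):
    """Size in bytes of one uncompressed mip level."""
    return width * height * bits_per_pixel // 8

# A 512x512 HDR texture:
print(uncompressed_bytes(512, 512, RGB9E5_BITS))     # 1048576 bytes (1 MiB)
print(uncompressed_bytes(512, 512, RGBA_HALF_BITS))  # 2097152 bytes (2 MiB)
```

So RGBA Half is only worth the extra memory when you actually need the alpha channel or the wider per-channel range.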