# Using Lighting Estimation with Sceneform models

The **Lighting Estimation** API analyzes a given image for discrete visual cues and provides detailed information about the lighting in a given scene. You can then use this information when rendering virtual objects to light them under the same conditions as the scene they're placed in, making these objects feel more realistic and enhancing the immersive experience for users.

Lighting cues and concepts
--------------------------

Humans unconsciously perceive subtle cues about how objects or living things are lit in their environment. When a virtual object is missing a shadow, or has a shiny material that doesn't reflect the surrounding space, users can sense that the object doesn't quite fit into a particular scene even if they can't explain why. This is why rendering AR objects to match the lighting in a scene is crucial for immersive and more realistic experiences.

Lighting Estimation does most of this work for you by providing detailed data that lets you mimic various lighting cues when rendering virtual objects. These cues are **shadows**, **ambient light**, **shading**, **specular highlights**, and **reflections**.

We can describe these visual cues like this:
- **Ambient light.** Ambient light is the overall diffuse light that comes in from around the environment, lighting everything.

- **Shadows.** Shadows are often directional and tell viewers where light sources are coming from.

- **Shading.** Shading is the intensity of the light in different areas of a given image. For example, different parts of the same object can have different levels of shading in the same scene depending on the angle relative to the viewer and its proximity to a light source (see the shading sketch after this list).

- **Specular highlights.** These are the shiny bits of surfaces that reflect a light source directly. Highlights on an object change relative to the position of a viewer in a scene.

- **Reflection.** Light bounces off of surfaces differently depending on whether the surface has specular (that is, highly reflective) or diffuse (not reflective) properties. For example, a metallic ball will be highly specular and reflect its environment, while another ball painted a dull matte grey will be diffuse. Most real-world objects have a combination of these properties -- think of a scuffed-up bowling ball or a well-used credit card.

  Reflective surfaces also pick up **colors** from the ambient environment. The coloring of an object can be directly affected by the coloring of its environment. For example, a white ball in a blue room will take on a bluish hue.
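To make the first few cues concrete, here is a minimal, purely illustrative shading sketch. It uses the classic Lambert diffuse and Blinn-Phong specular terms; none of this is part of the ARCore or Sceneform APIs, and the `shade` helper and its parameters are hypothetical.

    // Illustrative only: ambient + Lambert-diffuse + Blinn-Phong-specular shading.
    // All direction vectors are assumed to be normalized.
    double shade(double[] normal, double[] toLight, double[] toViewer,
                 double ambient, double lightIntensity, double shininess) {
      // Shading cue: surfaces facing the light are brighter (Lambert's cosine law).
      double diffuse = Math.max(0.0, dot(normal, toLight));

      // Specular-highlight cue: brightest where the half vector between the light
      // and the viewer lines up with the normal, so the highlight moves as the
      // viewer moves.
      double[] half = normalize(new double[] {
          toLight[0] + toViewer[0], toLight[1] + toViewer[1], toLight[2] + toViewer[2]});
      double specular = Math.pow(Math.max(0.0, dot(normal, half)), shininess);

      // Ambient-light cue: a constant term that lights everything, added to the
      // directional diffuse and specular contributions.
      return ambient + lightIntensity * (diffuse + specular);
    }

    double dot(double[] a, double[] b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

    double[] normalize(double[] v) {
      double len = Math.sqrt(dot(v, v));
      return new double[] {v[0] / len, v[1] / len, v[2] / len};
    }

Lighting Estimation spares you from deriving terms like these by hand for the real environment: it reports the measured lighting so your renderer can plug in matching values.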
Using Lighting Estimation modes to enhance realism
---------------------------------------------------

The [`Config.LightEstimationMode`](/ar/reference/java/arcore/reference/com/google/ar/core/Config.LightEstimationMode) API has modes that estimate lighting in the environment with different degrees of granularity and realism.

- [Environmental HDR mode](#sceneform-ehdr) (`ENVIRONMENTAL_HDR`). This mode consists of an API that allows realistic lighting estimation for directional lighting, shadows, specular highlights, and reflections.

- [Ambient Intensity mode](#ambient-intensity) (`AMBIENT_INTENSITY`). This mode determines the average pixel intensity and the color of the lighting for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.

- `DISABLED`. Disable `Config.LightEstimationMode` if lighting to match a given environment is not relevant for a scene or an object.

**Note:** When your app is using `Config.LightEstimationMode`, only significant changes to lighting or positioning of the device trigger updated effects.
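Whichever mode you choose, ARCore surfaces the estimate once per frame through `Frame.getLightEstimate()`. The following is a minimal sketch of checking that estimate from a per-frame update callback; the `onUpdateFrame` hook shown here is an assumed integration point, and Sceneform applies valid estimates to its own scene lighting automatically.

    // Sketch: check whether the current frame carries a usable light estimate.
    // Per the note above, ARCore only reports meaningful changes, so the values
    // can stay identical across many consecutive frames.
    void onUpdateFrame(Frame frame) {  // assumed per-frame hook in your app
      LightEstimate lightEstimate = frame.getLightEstimate();
      if (lightEstimate.getState() != LightEstimate.State.VALID) {
        return;  // no reliable estimate this frame; keep the previous lighting
      }
      // A valid estimate is available; read the mode-specific values here
      // (see the ENVIRONMENTAL_HDR and AMBIENT_INTENSITY sketches below).
    }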
### Using `ENVIRONMENTAL_HDR` mode

`ENVIRONMENTAL_HDR` mode uses machine learning to analyze the input camera image and synthesize environmental lighting for rendering a virtual object.

This mode combines directional lighting, ambient spherical harmonics, and an HDR cubemap to make virtual objects feel like they're physically part of a given scene:

- **Directional lighting** analyzes the apparent light source for a given image. This kind of lighting adds reasonably positioned specular highlights, and casts shadows in a direction consistent with other visible real objects.

- **Ambient spherical harmonics** get a realistic representation of the overall ambient light coming in from all directions in a scene. During rendering, this information is used to add subtle cues that bring out the definition of virtual objects.

- An **HDR cubemap** captures the environmental lighting surrounding the virtual object. During rendering, this cubemap creates the reflections on medium- to high-glossiness materials.

The following image shows an example of a virtual object placed in a scene with `ENVIRONMENTAL_HDR` enabled.

#### Configure `ENVIRONMENTAL_HDR` mode for a Sceneform scene

To use `ENVIRONMENTAL_HDR` with a Sceneform scene, extend the `ArFragment` class and override the configuration as follows:

    @Override
    protected Config getSessionConfiguration(Session session) {
      Config config = new Config(session);
      config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
      return config;
    }

To see an example of how this works, see the [Solar System sample](//github.com/google-ar/sceneform-android-sdk/tree/v1.15.0/samples/solarsystem). (This sample implements `ENVIRONMENTAL_HDR` without using `ArFragment`.)
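Sceneform consumes the Environmental HDR estimate automatically, but if you drive your own renderer you can also read the raw values directly from the per-frame `LightEstimate`. This is a minimal sketch under that assumption; how the values feed into your lighting pipeline depends on your engine.

    // Sketch: reading the raw ENVIRONMENTAL_HDR estimate from a LightEstimate
    // whose state is VALID. Requires the session to be configured with
    // Config.LightEstimationMode.ENVIRONMENTAL_HDR.
    void readEnvironmentalHdr(LightEstimate lightEstimate) {
      // Directional lighting: main light direction (x, y, z) and RGB intensity.
      float[] direction = lightEstimate.getEnvironmentalHdrMainLightDirection();
      float[] intensity = lightEstimate.getEnvironmentalHdrMainLightIntensity();

      // Ambient spherical harmonics: 27 coefficients (9 per RGB channel)
      // describing the overall ambient light arriving from all directions.
      float[] harmonics = lightEstimate.getEnvironmentalHdrAmbientSphericalHarmonics();

      // HDR cubemap: six faces used for reflections on glossy materials.
      // Close each image once it has been handed to the GPU.
      android.media.Image[] cubemap = lightEstimate.acquireEnvironmentalHdrCubeMap();
      try {
        // ... upload the cubemap faces to your renderer's reflection probe ...
      } finally {
        for (android.media.Image face : cubemap) {
          face.close();
        }
      }
    }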
### Using `AMBIENT_INTENSITY` mode

`AMBIENT_INTENSITY` mode determines the average pixel intensity and the color correction scalars for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.

- **Pixel intensity** captures the average pixel intensity of the lighting in a scene, for applying to a whole virtual object.

- **Color correction scalars** detect the white balance of each individual frame and let you color correct a virtual object so that it integrates more smoothly into the overall coloring of a scene.

#### Configure `AMBIENT_INTENSITY` mode for a Sceneform scene

To use `AMBIENT_INTENSITY` with a Sceneform scene, extend the `ArFragment` class and override the configuration as follows:

    @Override
    protected Config getSessionConfiguration(Session session) {
      Config config = new Config(session);
      config.setLightEstimationMode(Config.LightEstimationMode.AMBIENT_INTENSITY);
      return config;
    }
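As with the HDR mode, Sceneform applies this estimate for you, but the underlying values can also be read straight from the per-frame `LightEstimate` when you need them; a minimal sketch:

    // Sketch: reading the AMBIENT_INTENSITY estimate from a valid LightEstimate.
    // Requires the session to be configured with
    // Config.LightEstimationMode.AMBIENT_INTENSITY.
    void readAmbientIntensity(LightEstimate lightEstimate) {
      // Average pixel intensity of the camera image, typically used as an
      // overall brightness multiplier for the virtual object.
      float pixelIntensity = lightEstimate.getPixelIntensity();

      // Color correction: four scalars (r, g, b, pixel intensity) describing
      // the white balance of the current frame.
      float[] colorCorrection = new float[4];
      lightEstimate.getColorCorrection(colorCorrection, 0);
    }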