Multi-class prediction with a DNN
---------------------------------

A "deep" neural network (DNN) is simply an artificial neural network (ANN) with one or more hidden layers. This example demonstrates a very simple DNN with a single hidden layer. The DNN takes spectral vectors as inputs (i.e. one pixel at a time) and outputs a single class label and class probabilities per pixel. The Colab notebook below demonstrates creating the DNN, training it with data from Earth Engine, making predictions on exported imagery, and importing the predictions to Earth Engine.

[Run in Google Colab](https://colab.research.google.com/github/google/earthengine-community/blob/master/guides/linked/TF_demo1_keras.ipynb) | [View source on GitHub](https://github.com/google/earthengine-community/blob/master/guides/linked/TF_demo1_keras.ipynb)
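As a rough illustration of the structure described above (not the exact network from the linked notebook), a single-hidden-layer classifier over per-pixel spectral vectors might look like the following in Keras; the band and class counts are placeholders.

```python
import tensorflow as tf

N_BANDS = 6     # number of input spectral bands (placeholder)
N_CLASSES = 3   # number of output classes (placeholder)

# One spectral vector in, one vector of class probabilities out (per pixel).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_BANDS,)),
    tf.keras.layers.Dense(64, activation='relu'),            # the single hidden layer
    tf.keras.layers.Dense(N_CLASSES, activation='softmax'),  # class probabilities
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
# The predicted class label is the argmax of the softmax output.
```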
Hostable DNN for prediction in Earth Engine
-------------------------------------------

To get predictions from your trained model directly in Earth Engine (e.g. in the [Code Editor](/earth-engine/guides/playground)), you need to host the model on [Google AI Platform](https://cloud.google.com/ai-platform). This guide demonstrates how to save a trained model in [`SavedModel`](https://cloud.google.com/ml-engine/docs/tensorflow/exporting-for-prediction) format, prepare the model for hosting with the `earthengine model prepare` command, and get predictions in Earth Engine interactively with `ee.Model.fromAiPlatformPredictor`.

[Run in Google Colab](https://colab.research.google.com/github/google/earthengine-community/blob/master/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb) | [View source on GitHub](https://github.com/google/earthengine-community/blob/master/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb)
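A minimal sketch of those three steps, assuming a trained `tf.keras` model; the bucket, project, model, version, and tensor names are placeholders, and the exact input/output tensor names for `earthengine model prepare` depend on your model's serving signature (see the linked notebook for a working end-to-end version).

```python
import ee
import tensorflow as tf

ee.Initialize()

# 1) Save a trained tf.keras model in SavedModel format (path is a placeholder):
# model.save('gs://my-bucket/dnn_model')

# 2) Prepare the SavedModel for hosting (run as a shell / Colab "!" command);
#    the JSON maps the model's serving tensor names to Earth Engine band names:
#    earthengine model prepare \
#        --source_dir gs://my-bucket/dnn_model \
#        --dest_dir gs://my-bucket/dnn_model_eeified \
#        --input '{"<serving input tensor>": "array"}' \
#        --output '{"<serving output tensor>": "output"}'
#    Then deploy the prepared model as an AI Platform model version.

# 3) Connect to the hosted version from Earth Engine and get predictions
#    interactively (all names below are placeholders).
hosted_model = ee.Model.fromAiPlatformPredictor(
    projectName='my-project',
    modelName='dnn_demo_model',
    version='v1',
    inputTileSize=[8, 8],
    proj=ee.Projection('EPSG:4326').atScale(30),
    fixInputProj=True,
    outputBands={'output': {'type': ee.PixelType.float(), 'dimensions': 1}},
)
# predictions = hosted_model.predictImage(input_image.toArray())
```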
Logistic regression the TensorFlow way
--------------------------------------

Classical machine learning methods such as logistic regression are natural to implement in TensorFlow. This notebook demonstrates a logistic-regression-based deforestation detector built from before and after annual composites. Note that this very simplistic model is just for demonstration purposes; add a few hidden layers for higher accuracy.

[Run in Google Colab](https://colab.research.google.com/github/google/earthengine-community/blob/master/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb) | [View source on GitHub](https://github.com/google/earthengine-community/blob/master/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb)
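In Keras terms, logistic regression is just a single dense unit with a sigmoid activation over each pixel's stacked before/after composite bands. This is a generic sketch with a placeholder feature count, not the notebook's exact code.

```python
import tensorflow as tf

N_FEATURES = 12  # e.g. 6 bands from the "before" composite + 6 from "after" (placeholder)

# Logistic regression: one linear unit squashed through a sigmoid,
# giving P(deforested) for each pixel's stacked feature vector.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_FEATURES,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss='binary_crossentropy',
              metrics=['accuracy'])
# Adding a few Dense(..., activation='relu') layers before the output turns this
# into the small DNN the text suggests for higher accuracy.
```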
Regression with an FCNN
-----------------------

A "convolutional" neural network (CNN) contains one or more convolutional layers, in which inputs are neighborhoods of pixels, resulting in a network that is not fully connected but is well suited to identifying spatial patterns. A fully convolutional neural network (FCNN) does not contain a fully connected layer as output. This means that it does not learn a global output (i.e. a single output per image), but rather localized outputs (i.e. per pixel).
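To make the distinction concrete, here is a minimal fully convolutional sketch (far simpler than the UNET used in the notebook described below): because there is no flatten or dense layer at the output, the spatial dimensions are preserved and the model produces one value per pixel. The band count is a placeholder.

```python
import tensorflow as tf

N_BANDS = 6  # placeholder band count

# No Flatten/Dense at the end, so the output stays image-shaped (per-pixel).
inputs = tf.keras.Input(shape=(None, None, N_BANDS))
x = tf.keras.layers.Conv2D(32, 3, padding='same', activation='relu')(inputs)
x = tf.keras.layers.Conv2D(32, 3, padding='same', activation='relu')(x)
outputs = tf.keras.layers.Conv2D(1, 1, activation='sigmoid')(x)  # continuous [0,1] per pixel
fcnn = tf.keras.Model(inputs, outputs)
```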
This Colab notebook demonstrates the use of the [UNET model](https://arxiv.org/abs/1505.04597), an FCNN developed for medical image segmentation, to predict a continuous [0,1] output in each pixel from 256x256 neighborhoods of pixels. Specifically, this example shows how to export patches of data to train the network, and how to overtile image patches for inference in order to eliminate tile boundary artifacts.

[Run in Google Colab](https://colab.research.google.com/github/google/earthengine-community/blob/master/guides/linked/UNET_regression_demo.ipynb) | [View source on GitHub](https://github.com/google/earthengine-community/blob/master/guides/linked/UNET_regression_demo.ipynb)
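A hedged sketch of the export side of that workflow: when exporting TFRecord patches, the `kernelSize` format option pads each exported patch so that adjacent patches overlap ("overtiling"), and the overlapping margins can be trimmed from the predictions to avoid tile boundary artifacts. The image, bucket, region, and buffer size below are placeholders, not the notebook's actual values.

```python
import ee

ee.Initialize()

# Stand-in image; the linked notebook uses Landsat composites instead.
image = ee.Image('USGS/SRTMGL1_003')

task = ee.batch.Export.image.toCloudStorage(
    image=image,
    description='fcnn_inference_patches',
    bucket='my-bucket',                  # placeholder
    fileNamePrefix='fcnn_demo/patches',
    region=ee.Geometry.Rectangle([-122.6, 37.3, -121.8, 37.9]),  # placeholder region
    scale=30,
    fileFormat='TFRecord',
    formatOptions={
        'patchDimensions': [256, 256],   # the 256x256 pixel neighborhoods
        'kernelSize': [64, 64],          # overlap buffer used for overtiling (placeholder size)
        'compressed': True,
    },
)
task.start()
```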
Training on AI Platform
-----------------------

For relatively large models (like the FCNN example), the lifetime of the free virtual machine on which Colab notebooks run may not be sufficient for a long-running training job. Specifically, if the expected prediction error is not minimized on the evaluation dataset, then more training iterations may be prudent. For performing large training jobs in the Cloud, this Colab notebook demonstrates how to [package your training code](https://cloud.google.com/ml-engine/docs/packaging-trainer), [start a training job](https://cloud.google.com/ml-engine/docs/training-jobs), prepare a [`SavedModel`](https://cloud.google.com/ml-engine/docs/tensorflow/exporting-for-prediction) with the `earthengine model prepare` command, and get predictions in Earth Engine interactively with `ee.Model.fromAiPlatformPredictor`.

[Run in Google Colab](https://colab.research.google.com/github/google/earthengine-community/blob/master/guides/linked/AI_platform_demo.ipynb) | [View source on GitHub](https://github.com/google/earthengine-community/blob/master/guides/linked/AI_platform_demo.ipynb)
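A Colab-flavored sketch of those steps (the "!" prefix runs a shell command in a notebook cell). Job, bucket, package, and tensor names are placeholders, and the runtime/flag values should be checked against the linked notebook and current gcloud documentation.

```python
# Training code is packaged as a small Python package, e.g.:
#   trainer/
#     __init__.py
#     task.py   # builds the model and calls model.fit() on TFRecords in Cloud Storage

# Submit the long-running training job to AI Platform (placeholder names/values).
!gcloud ai-platform jobs submit training fcnn_demo_job \
    --package-path trainer \
    --module-name trainer.task \
    --region us-central1 \
    --runtime-version 2.3 \
    --python-version 3.7 \
    --scale-tier BASIC_GPU \
    --job-dir gs://my-bucket/fcnn_demo

# Once training has written a SavedModel, prepare it for Earth Engine hosting;
# the input/output tensor names depend on your model's serving signature.
!earthengine model prepare \
    --source_dir gs://my-bucket/fcnn_demo/model \
    --dest_dir gs://my-bucket/fcnn_demo/eeified \
    --input '{"<serving input tensor>": "array"}' \
    --output '{"<serving output tensor>": "output"}'

# Predictions are then obtained interactively with ee.Model.fromAiPlatformPredictor,
# exactly as in the hosted-DNN example above.
```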
Last updated 2025-07-25 UTC.