Update your robots.txt file
If you use a site hosting service, such as Wix or Blogger, you might not need to (or be able to) edit your robots.txt file directly. Instead, your provider might expose a search settings page or some other mechanism to tell search engines whether or not to crawl your page. If you want to hide or unhide one of your pages from search engines, search for instructions about modifying your page visibility in search engines on your hosting service; for example, search for "wix hide page from search engines".

To update the rules in your existing robots.txt file, download a copy of the file from your site and make the necessary edits.
Download your robots.txt file
You can download your robots.txt file in various ways, for example:

- Navigate to your robots.txt file, for example https://example.com/robots.txt, and copy its contents into a new text file on your computer. Make sure you follow the guidelines related to the file format when creating the new local file.
- Download an actual copy of your robots.txt file with a tool like cURL (a scripted alternative is sketched after this list). For example:

  ```
  curl https://example.com/robots.txt -o robots.txt
  ```

- Use the robots.txt report in Search Console to copy the content of your robots.txt file, which you can then paste into a file on your computer.
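If you prefer to script the download, the same fetch can be done with Python's standard library. This is a minimal sketch, not part of Google's documentation; the URL is a placeholder you would replace with your own site's address:

```python
import urllib.request

# Placeholder URL; replace with your own site's robots.txt address.
url = "https://example.com/robots.txt"

# Fetch the live file and save a local copy for editing.
with urllib.request.urlopen(url) as response:
    content = response.read().decode("utf-8")

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(content)
```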
Edit your robots.txt file
Open the robots.txt file you downloaded from your site in a text editor and make the necessary edits to the rules. Make sure you use the correct syntax and that you save the file with UTF-8 encoding.
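As a rough self-check before uploading, Python's standard urllib.robotparser can parse your edited rules and evaluate sample URLs against them. A minimal sketch follows; the rules and test URL are illustrative, and the parser only approximates how Google interprets robots.txt:

```python
from urllib import robotparser

# Illustrative rules; substitute the contents of your edited file.
rules = """\
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
"""

# Save with UTF-8 encoding, as the format guidelines require.
with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(rules)

# Parse the rules and test whether a sample URL would be crawlable.
parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```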
Upload your robots.txt file
Upload your new robots.txt file to the root directory of your site as a text file named robots.txt. The way you upload a file to your site is highly platform and server dependent. Check out our tips for finding help with uploading a robots.txt file to your site.

If you do not have permission to upload files to the root directory of your site, contact your domain manager to make changes. For example, if your site home page resides under subdomain.example.com/site/example/, you likely cannot update the robots.txt file at subdomain.example.com/robots.txt. In this case, contact the owner of example.com/ to make any necessary changes to the robots.txt file.
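As one hypothetical example of an upload, if your host exposes plain FTP access to the document root, Python's standard ftplib could push the file. The hostname and credentials below are placeholders, and many hosts require a different mechanism entirely:

```python
from ftplib import FTP

# Placeholder connection details; use your hosting provider's values.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="username", passwd="password")
    # Upload into the document root so the file is served at /robots.txt.
    with open("robots.txt", "rb") as f:
        ftp.storbinary("STOR robots.txt", f)
```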
Refresh Google's robots.txt cache
During the automatic crawling process, Google's crawlers notice changes you made to your robots.txt file and update the cached version every 24 hours. If you need to update the cache faster, use the Request a recrawl function of the robots.txt report.
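While waiting on Google's cache, you can at least confirm that your server is already serving the updated file. A small sketch, assuming the local copy sits next to the script and the placeholder URL is replaced with your own:

```python
import urllib.request

# Placeholder URL; replace with your own site's robots.txt address.
url = "https://example.com/robots.txt"

with open("robots.txt", "r", encoding="utf-8") as f:
    local_copy = f.read()

with urllib.request.urlopen(url) as response:
    live_copy = response.read().decode("utf-8")

if live_copy == local_copy:
    print("The server is serving the updated robots.txt.")
else:
    print("The live file still differs from the local copy.")
```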