1. Create a plain-text file named robots.txt.
2. Add the following lines:
User-agent: *
Disallow:/Documents/*.zip&providername=NetworkDocuments
Disallow:/Documents/*.dll&providername=NetworkDocuments
Disallow:/Documents/*.exe&providername=NetworkDocuments
Disallow:/Documents/*.mp4&providername=NetworkDocuments
Disallow:/Documents/*.png&providername=NetworkDocuments
Disallow:/Documents/*.config&providername=NetworkDocuments
Disallow:/Documents/*.msi&providername=NetworkDocuments
Disallow:/Documents/*.bat&providername=NetworkDocuments
Disallow:/Documents/*.bak&providername=NetworkDocuments
Disallow:/Documents/*.tmp&providername=NetworkDocuments
Disallow:/Documents/*.css&providername=NetworkDocuments
Disallow:/Documents/*.ico&providername=NetworkDocuments
Disallow:/Documents/*.svg&providername=NetworkDocuments
Disallow:/Documents/*.gif&providername=NetworkDocuments
Disallow:/Documents/*.jpg&providername=NetworkDocuments
Disallow:/Documents/*.jpeg&providername=NetworkDocuments
Disallow:/Documents/*.inf&providername=NetworkDocuments
Disallow:/Documents/*.xml&providername=NetworkDocuments
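The list above can be generated rather than typed by hand, which also avoids accidental duplicate lines. The sketch below is just one convenient way to do this; the /Documents/ path and the providername=NetworkDocuments suffix are taken from the article, so adjust them for your own portal.

```python
# Sketch: build the robots.txt body from a list of blocked extensions.
# The path and providername suffix come from the article; change them
# to match your own portal if they differ.
BLOCKED_EXTENSIONS = [
    "zip", "dll", "exe", "mp4", "png", "config", "msi", "bat", "bak",
    "tmp", "css", "ico", "svg", "gif", "jpg", "jpeg", "inf", "xml",
]

def build_robots_txt(extensions):
    lines = ["User-agent: *"]
    for ext in extensions:
        lines.append(f"Disallow:/Documents/*.{ext}&providername=NetworkDocuments")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Writes robots.txt into the current directory.
    with open("robots.txt", "w") as f:
        f.write(build_robots_txt(BLOCKED_EXTENSIONS))
```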
3. How it works:
The following line tells the indexer to ignore .zip files:
Disallow:/Documents/*.zip&providername=NetworkDocuments
To make the indexer ignore .doc files as well, change .zip to .doc and add the resulting line to the robots.txt file:
Disallow:/Documents/*.doc&providername=NetworkDocuments
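The Disallow patterns are glob-style wildcards. As an illustration only (the portal indexer's exact matching rules are not documented here), Python's fnmatch can approximate which request paths a given rule would cover:

```python
from fnmatch import fnmatchcase

# Approximation only: the indexer's real matching rules may differ.
def is_disallowed(request_path, pattern):
    """Return True if request_path matches a glob-style Disallow pattern."""
    return fnmatchcase(request_path, pattern)

zip_rule = "Disallow:/Documents/*.zip&providername=NetworkDocuments"
pattern = zip_rule.split(":", 1)[1]  # strip the "Disallow:" prefix
```

For example, a path like /Documents/archive.zip&providername=NetworkDocuments matches the .zip rule, while a .doc path does not, which is why each extension needs its own line.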
4. Put the robots.txt file in the root web folder on the webserver. For example, the root folder for my portal is C:\inetpub\wwwroot\DavidW2\; yours is probably C:\inetpub\wwwroot\Production Portal\ or something similar.
5. Run the indexer, or wait for it to run on its schedule. You can check the schedule in Task Scheduler on the webserver.
To limit folders see: