DTSearch, as currently configured in the portal, indexes every document and then relies on the portal itself to filter search results based on the viewing user's permissions.
One scenario where this is undesirable is a private workgroup folder: documents that must never surface through document search and should be reachable only through that workgroup's pages.
To set up a Robots.txt file, create a file named Robots.txt in the root of the install folder.
Add the following text to it:
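The text itself is not reproduced here. A Robots.txt file conventionally opens with a User-agent line declaring which crawlers the rules apply to; assuming the intent is to apply the exclusions below to all crawlers (including DTSearch's indexer), the standard header is:

```
User-agent: *
```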
Then add one line for each folder you want to exclude. For network documents, the lines look like this:
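The original sample line is not shown. Based on the explanation that follows, it is a Disallow directive ending in the folder's numeric ID; the path prefix below is hypothetical and depends on how your portal exposes network documents:

```
Disallow: /NetworkDocuments/5
```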
The 5 at the end is the SQL ID of the Mapped Network folder.