When I check my site, an error occurs; see below:
#
# This file contains rules to prevent the crawling and indexing of certain parts
# of your web site by the spiders of major search engines like Google and Yahoo.
# By managing these rules you can allow or disallow access to specific folders
# and files for such spiders.
# This is a good way to hide private data and save a lot of bandwidth.
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html
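# As a quick illustration of the syntax (hypothetical paths, not rules
# used by this site), a record pairs a User-agent line with one or more
# Disallow lines:
#
#   User-agent: *          (applies to every crawler)
#   Disallow: /private/    (blocks everything under /private/)
#   Disallow: /page.html   (blocks a single file)
#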
User-agent: *
# Files
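# Note: Disallow paths should begin with "/" so crawlers can match them
# against URL paths; a bare filename will not match anything.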
Disallow: /ow_version.xml
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /README.txt
Disallow: /UPDATE.txt
Disallow: /CHANGES.txt
# URLs
Disallow: /admin/