[robots.txt] How to configure robots.txt to allow everything?

If you want to allow every bot to crawl everything, this is the best way to specify it in your robots.txt:

User-agent: *
Disallow:

Note that the Disallow field has an empty value, which, according to the specification, means:

Any empty value, indicates that all URLs can be retrieved.
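
If you want to sanity-check this behavior, Python's standard urllib.robotparser module can parse a record and answer per-URL queries (a minimal sketch; the MyBot user-agent and example.com URL are just placeholders):

from urllib import robotparser

# The allow-everything record: an empty Disallow value.
lines = ["User-agent: *", "Disallow:"]

rp = robotparser.RobotFileParser()
rp.parse(lines)

# An empty Disallow blocks nothing, so any URL is fetchable.
print(rp.can_fetch("MyBot", "https://example.com/any/path"))  # True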


Your way (with Allow: / instead of Disallow:) works, too, but Allow is not part of the original robots.txt specification, so not all bots support it (many popular ones do, though, like Googlebot). That said, unrecognized fields have to be ignored, and for bots that don't recognize Allow, the result would be the same in this case anyway: if nothing is forbidden to be crawled (with Disallow), everything is allowed to be crawled.
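
For what it's worth, urllib.robotparser is one of the parsers that does understand Allow, and it gives the same verdict for both variants; a small sketch using the same placeholder names as above:

from urllib import robotparser

def sample_url_allowed(lines):
    # Parse one record and probe a single sample URL (a placeholder, not an exhaustive check).
    rp = robotparser.RobotFileParser()
    rp.parse(lines)
    return rp.can_fetch("MyBot", "https://example.com/some/page")

print(sample_url_allowed(["User-agent: *", "Disallow:"]))  # True
print(sample_url_allowed(["User-agent: *", "Allow: /"]))   # True
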
However, formally (per the original spec) an Allow-only record is invalid, because at least one Disallow field is required:

At least one Disallow field needs to be present in a record.