Cohen moves on to the Automated Content Access Protocol (ACAP), which is intended to replace the robots.txt convention now in use, formally the Robots Exclusion Protocol. To an outsider, the current arrangement already seems both flexible and restrictive enough, since the search engines honor it voluntarily: a crawler that respects robots.txt agrees to stay out of all or part of a web site. Reading between the lines, the ACAP advocates want to shut out any search engine that does not pay up.
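To make the existing mechanism concrete, here is a minimal sketch of how a well-behaved crawler honors robots.txt, using Python's standard-library parser. The site paths and crawler names are hypothetical, invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: one named crawler is barred from /archives/,
# while everyone else may fetch anything.
robots_txt = """\
User-agent: Googlebot
Disallow: /archives/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The excluded crawler is refused; any other agent is allowed.
print(rp.can_fetch("Googlebot", "/archives/2007/story.html"))      # False
print(rp.can_fetch("SomeOtherBot", "/archives/2007/story.html"))   # True
```

The point of the sketch is that compliance is entirely on the crawler's side: nothing in the protocol enforces the exclusion, which is exactly the voluntary arrangement ACAP's backers apparently find insufficient.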
It is not clear how much clout ACAP has. Its members, principally European, notably include AFP and the AP and several book publishers, but none of the large American or European newspapers, as far as can be told from the ACAP website linked here. They would have you believe that their only interest is benefiting the consumer, and they are coy about what their protocol would actually do, other than apply "metadata to data."
Where "fair use" plays in all this is not clear, but be suspicious. Expect an attack in Congress to try to impose the ACAP on search engines, an ACAP with real teeth.