I just worked out a PostgreSQL-specific option for this. It's a bit of a hack, with its own trade-offs and limitations, but it seems to work and isn't tied to a specific development language, platform, or PG driver.
The trick, of course, is to find a way to pass an arbitrary-length collection of values as a single parameter and have the database recognize it as multiple values. The solution I have working is to construct a delimited string from the values in the collection, pass that string as a single parameter, and use string_to_array() with the requisite casting for PostgreSQL to make proper use of it.
So if you want to search for "foo", "blah", and "abc", you might concatenate them into a single string: 'foo,blah,abc'. Here's the straight SQL (with my_column and my_table as placeholder names):
select my_column from my_table
where search_column = any (string_to_array('foo,blah,abc', ',')::text[]);
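If it helps to see what's happening, string_to_array() just splits the string back out into a proper array, which you can verify on its own:

select string_to_array('foo,blah,abc', ',');
-- result: {foo,blah,abc}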
You would obviously change the explicit cast to whatever you want your resulting value array to be -- int, text, uuid, etc. And because the function takes a single string value (or two, I suppose, if you want to parameterize the delimiter as well), you can pass it as a parameter in a prepared statement:
select my_column from my_table
where search_column = any (string_to_array($1, ',')::text[]);
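You can watch the binding work without involving any driver at all, using PostgreSQL's own PREPARE/EXECUTE. This is just a sketch reusing the placeholder names from above (plus a hypothetical id_column for the int variant):

-- the whole collection binds as one text parameter
prepare search_any (text) as
    select my_column from my_table
    where search_column = any (string_to_array($1, ',')::text[]);

execute search_any('foo,blah,abc');

-- for integer values, only the cast changes
prepare search_ids (text) as
    select my_column from my_table
    where id_column = any (string_to_array($1, ',')::int[]);

execute search_ids('1,2,3');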
This is even flexible enough to support things like LIKE comparisons:
select my_column from my_table
where search_column like any (string_to_array('foo%,blah%,abc%', ',')::text[]);
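And if you want case-insensitive matching, swapping in ILIKE should be the only change needed:

select my_column from my_table
where search_column ilike any (string_to_array('foo%,blah%,abc%', ',')::text[]);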
Again, no question it's a hack, but it works and allows you to still use pre-compiled prepared statements that take *ahem* discrete parameters, with the accompanying security and (maybe) performance benefits. Is it advisable and actually performant? Naturally, it depends, as you've got string parsing and possibly casting going on before your query even runs. If you're expecting to send three, five, a few dozen values, sure, it's probably fine. A few thousand? Yeah, maybe not so much. YMMV, limitations and exclusions apply, no warranty express or implied.
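One concrete gotcha hiding in that "limitations apply" clause: the delimiter can't occur inside the values themselves, or the split lands in the wrong places:

select string_to_array('New York, NY,Boston, MA', ',');
-- result: {"New York"," NY","Boston"," MA"} -- two intended values became four

If your data might contain commas, pick a delimiter it can't contain -- the ASCII unit separator is a classic choice:

select my_column from my_table
where search_column = any (string_to_array($1, E'\x1f')::text[]);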
But it works.