I've seen large institutions distribute large amounts of data via public FTP before, but it takes a lot of commitment and tends to get bogged down quickly even then. And that's with data only academics care about. I once downloaded a terabyte dataset from a big company, and they used a time-limited user/pass generated on a per-request basis. If Mega doesn't work, you might try something like that (never revealing the IP/user/pass to the public).
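For what it's worth, the per-request credential idea doesn't have to be complicated. One common way to do it (I'm guessing at their setup, this is just an illustration) is an HMAC-signed expiring link, which is roughly how S3 presigned URLs work. A minimal sketch in Python, assuming a made-up secret and file path:

```python
import hashlib
import hmac
import time

# Hypothetical shared secret held only by the download server.
SECRET_KEY = b"replace-with-a-real-secret"

def make_download_token(path: str, ttl_seconds: int = 3600) -> str:
    """Issue a time-limited link tied to one file path (one per request)."""
    expires = int(time.time()) + ttl_seconds
    msg = f"{path}:{expires}".encode()
    sig = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"

def verify_download_token(path: str, expires: int, sig: str) -> bool:
    """Server-side check: signature must match and link must not be expired."""
    if time.time() > expires:
        return False
    msg = f"{path}:{expires}".encode()
    expected = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

if __name__ == "__main__":
    # Hand this URL to exactly one requester; it dies after 10 minutes.
    print(make_download_token("/datasets/big-dump.tar", ttl_seconds=600))
```

The nice part is the server stores nothing per link: the expiry and signature travel in the URL itself, so you can hand out thousands of one-off links without a credentials database.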