I guess it's because they've made them available to run locally (OpenAI's models might not actually be heavier, but their unavailability makes them seem infinitely heavy, since you can't run them yourself), and because their API pricing is much cheaper.