@admitsWrongIfProven AI models that are less costly in compute and bandwidth to train, less costly in compute and memory to run, and that can be run locally on consumer hardware rather than via API in the cloud. New Chinese models like DeepSeek's and ByteDance's are examples, though other, earlier models are runnable on consumer hardware too. If API pricing reflects cost, the new Chinese models are lighter. I'd like to see a global competitive race to get lighter!