Flux has entered the image creation market
Flux is overrated. It is a huge model, at least compared to Stable Diffusion, which does let it produce "better" images, as long as you like very narrow depth of field and blurry images. It also lacks a negative prompt, which is what allowed Stable Diffusion-based models to stay competitive with the likes of Midjourney while still being runnable on something like a 3060.
There is also this huge double-sided push to get AI off of consumer hardware and into the cloud. On the video card side, Nvidia isn't making cards with more VRAM, and on the other side the AI companies are making their models just large enough that you cannot run them on consumer GPUs and thus have to pay them for tokens. The price difference between a 4090 (24 GB for under $2,000, which CAN actually run Flux and LLaMA 13B) and a business-class GPU (an RTX A6000 with 48 GB for $5,000-$6,000, or an A100 with 80 GB for ~$15,000, which can run LLaMA 65B) is insane.
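The VRAM math behind that pricing tier list is simple enough to sketch. This is a rough back-of-the-envelope estimate (weights only; activations, KV cache, and framework overhead add more on top), but it shows why the parameter counts seem to land just past the 24 GB consumer ceiling:

```python
def weights_gb(params_billion, bytes_per_param=2.0):
    """Approximate GB of VRAM to hold model weights alone.

    bytes_per_param: 2.0 for fp16/bf16, 1.0 for int8, 0.5 for 4-bit quantization.
    Ignores activation memory and KV cache, which add several GB more.
    """
    return params_billion * bytes_per_param

# LLaMA 13B in fp16: ~26 GB -> just over a 4090's 24 GB without quantization
print(weights_gb(13))        # 26.0

# LLaMA 65B in fp16: ~130 GB -> even an 80 GB A100 needs quantization or sharding
print(weights_gb(65))        # 130.0

# 13B at 4-bit: ~6.5 GB -> fits comfortably on consumer cards
print(weights_gb(13, 0.5))   # 6.5
```

In other words, a 4090 runs 13B models mainly thanks to quantization, while full-precision weights for the bigger models conveniently require exactly the cards sold at datacenter prices.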