This is grand, I am sold! The CUDA monopoly will start waning, and AMD will soon have the hardware everyone is dying for.
Contra point: LLM training is not for small players, and the capability gap will only get bigger. Google and Azure will be (or already are?) the only ones who can afford to buy the hardware required to train an LLM from scratch (as opposed to fine-tuning one).
I have personally used the generative AI offering in GCP's Vertex AI, and it's very simple and superb. I don't think smaller companies should even consider buying all this hardware just to train a relatively small LLM with limited capability, instead of using GCP's or Azure's AI offerings. I feel it's a real possibility that in a year or two everybody realises this, which will reduce demand. A rough sketch of what I mean by "simple" is below.
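For illustration only, here is a minimal sketch of calling a hosted model through Vertex AI's Python SDK, assuming the google-cloud-aiplatform package and the text-bison foundation model; the project ID and prompt are placeholders:

```python
# Minimal sketch: using a hosted LLM on GCP Vertex AI instead of training your own.
# Assumes `pip install google-cloud-aiplatform` and a GCP project with Vertex AI enabled.
import vertexai
from vertexai.language_models import TextGenerationModel

# Placeholder project ID and region.
vertexai.init(project="my-gcp-project", location="us-central1")

# Load a hosted foundation model -- no hardware purchase or training run required.
model = TextGenerationModel.from_pretrained("text-bison")

response = model.predict(
    "Summarise the trade-offs of training an in-house LLM versus using a hosted one.",
    temperature=0.2,
    max_output_tokens=256,
)
print(response.text)
```

That is essentially the whole integration, which is why I doubt smaller companies will keep buying training hardware.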
disc: have held a position in AMD for months, added recently.