It is fairly obvious why generative AI will increase the need for computing resources: training a model means finding patterns and structures within a data set, and that can be quite complex depending on the number of parameters to be learned, which can run into the billions.
In addition to the number of parameters, the sheer volume of training data matters, observers note. A specialized chatbot trained on a narrow corpus is less compute intensive, while broad natural language tasks such as translation can be quite compute intensive.