Machine learning has made remarkable strides in recent years, with models matching human capabilities on a growing range of tasks. However, the main hurdle lies not just in training these models but in running them efficiently in everyday applications. This is where AI inference comes into play, emerging as a primary concern for researchers and tech leaders.