So look, when we look at overall DRAM demand, the DRAM TAM, AI is of course driving growth. Automotive is certainly driving growth. Other end markets that we mentioned, such as mobile, PC and consumer, have been somewhat lackluster in their end demand. The AI demand in the data center, including the enterprise, definitely drives healthy trends for memory growth. Yes, enterprise server and some data center demand has recently been somewhat impacted by macro trends, but the trend of AI driving more memory is absolutely continuing. And when we look at our overall 2023 demand growth and the CAGR projections we have ahead of us, we have taken those into account. These are very, very early innings for AI, and AI is really pervasive. It's everywhere: in cloud applications, in enterprise server applications, and applications such as generative AI will be in enterprises too, because due to the confidentiality of data, enterprises will be building their own large language models. And as you know, while the enterprise large language models may not be as large as the ones you may see in examples such as superclusters, et cetera, all of them are really trending toward a greater number of parameters. Now we are talking about generative AI models getting into even the trillion-parameter range; not too long ago, these used to be in the hundreds of millions. That requires more memory. So regardless of the application, whether it is on the enterprise side or on the cloud server side, memory requirements are continuing to increase. And I'll just point out that the 6x to 8x we have mentioned is the multiple of DRAM required in an AI server versus a standard server.
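As a rough illustration of the arithmetic behind the remarks above, this is a back-of-envelope sketch of why parameter growth drives memory demand. The byte-per-parameter figure and the example parameter counts are assumptions for illustration only (2 bytes per weight assumes FP16/BF16 storage), not figures from the call.

```python
# Illustrative back-of-envelope estimate: raw memory to hold model weights.
# Assumptions (not from the transcript): 2 bytes per parameter (FP16/BF16),
# example sizes of ~350M and 1T parameters.

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Gigabytes needed just to store the model weights."""
    return num_params * bytes_per_param / 1e9

# A model in the "hundreds of millions" range (assumed ~350M parameters):
small = weight_memory_gb(350e6)   # 0.7 GB

# A trillion-parameter model:
large = weight_memory_gb(1e12)    # 2000 GB, i.e. ~2 TB of weights alone

print(f"{small:.1f} GB vs {large:.0f} GB")
```

Weights are only part of the footprint; activations, KV caches and optimizer state push real deployments higher still, which is the direction of the memory-content trend described above.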
And of course, as we highlighted in the script, there are many compute configurations, such as the supercluster example that we gave you, where the required DRAM content is a few hundred times higher than in a standard server. So really, I think the journey ahead of us will be very exciting. When we look at machine-to-machine communication, and at the opportunities in the virtuous cycle of ever-increasing data, the requirements that training applications, inferencing at scale and various edge applications, including automotive, place on memory and storage will continue to grow well, and Micron is going to be well positioned with our products. We consider 2024 to be a banner year for AI, for memory and storage, and Micron will be well positioned to capture this with our strong portfolio of products, from D5 to LP5 to HBM to high-density modules, even including graphics.