Yeah Harlan, that's a very good observation. The pace of AI innovation is increasing, and not just at the big semi companies but of course at these system companies as well. Several announcements did come out: it's now public that Meta is designing a lot of silicon for AI, and of course Google, Microsoft, and Amazon are too. So all the big hyperscaler companies, along with Nvidia, AMD, Qualcomm, and the others; Samsung had an AI phone this year. So there is a lot of acceleration on both the semi side and the system side, and we are involved with all the major players there, and we are glad to provide our solutions.

And this is the other thesis we have talked about for years now, five, seven years: that the system companies will do silicon for a lot of reasons, for customization, for schedule and supply chain control, and for cost benefits if there is enough scale. And the workload of AI is immense. If you look at some of the big hyperscaler and social media companies, they are talking about using 20,000 to 24,000 GPUs to train these new models. And as the size of the models and the number of models increase, the number of GPUs required to train these models, and of course to do inference on them, could go much higher than it is right now. So I think we are still in the early innings in terms of system companies developing their own chips while at the same time working with the semi companies.

So I expect that to grow, and our business with those system companies doing silicon, I would like to say, is growing faster than the Cadence average. But the good thing is the semi guys are also doing a lot of business. So I don't know if that 45% will change, because that's a combination of a lot of companies. But overall, the AI and hyperscaler companies are doing a lot more, and so are the big semi companies.