In my predictions for 2018 I had identified five key trends driving verification: Security, Safety, Application Specificity, Processor Ecosystems, and System Design Enablement, all centered around ecosystems. Looking back now as the year draws to a close, the key verification highlights of 2018 do indeed fit into these five categories, but there were also some surprises.
Image source: BigStockPhoto
First, on security and safety, this year indeed saw significant progress. Functional safety has become a huge topic, especially in automotive, and is quickly extending into other domains like medical and aero/defense. Cadence showcased usage by ROHM in a press release and at CDNLive Japan, as well as our work with Arm in this domain throughout 2018. Verification planning, a topic that we have been pushing for years with our metric-driven verification approaches, started to extend in 2018 into three different areas: classic “functional verification planning” is joined by “safety verification planning”, with all the aspects around FMEDA (failure modes, effects, and diagnostic analysis), and next up is “security verification planning”. Following Arm CEO Simon Segars’s call to action at Arm TechCon 2017 to “put hackers out of business”, we saw a flurry of activity around Arm’s PSA (Platform Security Architecture), and at CDNLive and DAC Cadence showcased how we partner with companies like Tortuga Logic to connect with our Verification Suite. The key aspect to remember here, and likely a key trend going forward, is threat models at several levels of abstraction, from RTL to system and software level, representing as early as possible the paths that hackers could use to attack chip and system security and allowing verification to check for proper countermeasures.
Also, Application Specificity has, as I had predicted, continued to drive very domain-specific verification requirements.
The 5G/networking space drives unique verification requirements. Getting 10 Gbps data rates, a million devices per square kilometer, and 1 ms latency to work together is far from trivial. The trends in 2018 suggest that the high data rates will likely be the first focus (yay, Netflix!), but the 5G verification challenges, especially for architecture analysis, remain tough and require extensive tool support for the right protocols. And short of improving the speed of light, the only way to achieve 1 ms latencies seems to be data locality. In that context, “Edge Computing” may well be 2018’s tech term of the year, and it has spawned a flurry of design activity. The intensity of activity in this domain, for instance at Arm TechCon, certainly surprised me.
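The speed-of-light argument is easy to make concrete with a back-of-envelope calculation (my own illustration, not from the original post): even under generous assumptions, a 1 ms round-trip latency budget caps how far away a server can physically sit, which is exactly why compute has to move to the edge.

```python
# Back-of-envelope check: how far away can a server be if the
# round-trip latency budget is 1 ms? (Illustrative sketch only.)
C_VACUUM_KM_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3          # signals in optical fiber travel at roughly 2/3 c
RTT_BUDGET_S = 1e-3           # 5G target round-trip latency: 1 ms

# One-way propagation gets at most half the round-trip budget,
# and this ignores switching, queuing, and processing delays entirely.
max_distance_km = C_VACUUM_KM_S * FIBER_FACTOR * (RTT_BUDGET_S / 2)
print(f"Max server distance: ~{max_distance_km:.0f} km")
```

Roughly 100 km at best, and real networks do far worse once routing and processing overheads are counted, so the data has to live close to the device.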
In the aero/defense space, DARPA started the ERI program, which has spawned a lot of research activity, including Cadence activities in the area of AI/ML for EDA. Verification of systems of systems is a big topic, with emulation front and center as shown in presentations from AFRL and Cadence at GOMAC 2018, and emulation requirements were a big topic at the DARPA ERI forum.
The AI/ML area feels to me like graphics development back in the mid-’90s, only bigger. Lots of start-up funding worldwide, especially in China, has made hardware-assisted verification, i.e., emulation and prototyping, a key necessity for these complex designs. Having an emulator that handles large-capacity designs well and offers flexible, efficient debug and optimized throughput puts us in a good spot here. Like IoT, this area overlaps quite a bit with other application domains, like ADAS in automotive, as well as applications in medical, networking, and others.
The server space saw further movement, with Arm servers making significant inroads. Specialized workloads change the datacenter picture quite a bit, and we at Cadence have announced availability of our tools on Arm-based servers too. Ecosystems play a key role here: Arm announced the Server Ready program as an ecosystem effort, for which, for instance, portable stimulus and emulation are used extensively, as we announced at Arm TechCon. Expect more interesting news here in the years to come.
Not an application in itself but spanning application domains, IoT continued to be a key buzzword for the industry in 2018. Mixed-signal verification is key for these relatively small designs (if they are edge nodes), and cloud plays a key role here. For small designs, users don’t want to build up all the EDA infrastructure in-house, so cloud activities like the one we have with Arm as part of the DesignStart program come into play. But cloud becomes important for big server-class designs too. As announced in the Cadence Q2’18 earnings call, “Ampere Computing chose Palladium Z1 for the development of their next-generation ARM-based server chip. Palladium Z1 was chosen for its scalability for the large designs, state of the art debugging features, stability and availability over the cloud.” Clouds everywhere, as also announced at DAC!
Processor ecosystems did prove very important, perhaps even a bit more prominent than I had thought. My colleague Paul McLellan has covered this area nicely, most recently here. As stated in Paul’s blog, Qualcomm has now also disclosed that it will be shipping RISC-V in a high-volume product in 2019. With others like Western Digital and NVIDIA already on the RISC-V train, joined by several commercial IP providers, the area of processor ecosystems has been most interesting to watch. Alliances are developing fast, and friends one day may be foes the next, as I described here regarding the announcements at Arm TechCon.
Last but most certainly not least, System Design Enablement made huge strides in 2018, as could be seen at the Design Automation Conference in June. New alliances and ecosystems are forming in this area; MathWorks and National Instruments come to mind as key partners helping verification and EDA extend into that domain. And of course virtualization has become key to verification, both in augmenting physical interfaces for hardware engines and in adding higher levels of abstraction with virtual platforms, allowing the industry to “shift left” for earlier hardware/software integration.
All in all, it turns out that my biggest mis-prediction from last year was that HBO’s “Game of Thrones” would come to a conclusion in 2018. Not sure what I was thinking. The intensity around AI/ML chips was bigger in 2018 than I expected, and Edge Computing pushed much harder than I would have thought. But the rest of my predictions were pretty much right on.