Who’s in Charge of Multicore?

Like just about every other technology, multicore processors have an industry organization to help create best practices and guidelines. For developers, following The Multicore Association (MCA) can be a convenient way to keep up with what processor manufacturers, OS vendors, universities and other ecosystem members are planning a year or two out. MCA president Markus Levy recently spoke with Intelligence in Software about the organization’s current initiatives.

Q: What’s MCA’s role in the industry? For example, do you work with other trade groups and standards bodies?

Markus Levy: Multicore is a huge topic with many varieties of processors, issues and benefits. The target market, and specifically the target application, determines whether to use a homogeneous symmetric multiprocessing (SMP) processor or a highly integrated system-on-a-chip with many heterogeneous processing elements. The Multicore Association is tackling a piece of this problem to enable portability and ease of use.

With this in mind, we primarily aim to develop application program interfaces (APIs) that allow processor vendors, operating system vendors and programmers to build multicore-related products on open specifications. It’s important to note that The Multicore Association doesn’t currently work with other trade groups or standards bodies, and we make a point of never competing with them or developing redundant specifications.

Q: Developers work with whatever hardware vendors provide at any given time. In this case, that’s multicore processors. How can keeping up with MCA help developers understand what kind of hardware and software might be available to them one, three or five years down the road? For example, what are some current association initiatives that they should keep an eye on?

M.L.: Good question. Notice that the MCA membership comprises a mixture of processor vendors, OS vendors and system developers. For processor vendors, the main benefit is support for current and future generations of multicore processors. In other words, our specifications make it easier for their customers to write application code once and need only minor changes when moving to a next-generation processor.

The OS vendors are also using the MCA standards to enhance their offerings, and their customers are requesting that support. System developers are using our open standards to create their own proprietary implementations, optimized for their needs.

We currently have a Tools Infrastructure Working Group (TIWG) that is defining a common data format and creating standards-based mechanisms to share data across diverse, non-interoperable development tools, specifically at the interfaces between profilers and analysis/visualization tools. In this regard, the TIWG is also collaborating with the CE Linux Forum on a reference implementation of the de facto trace data format standard that TIWG will define.

Our most popular specification to date is the Multicore Communications API (MCAPI), and the working group is currently defining refinements and enhancements for version 2.0. MCAPI has now been implemented by most OS vendors and quite a few university projects, and system companies have developed their own versions.

We’ve recently formed a Multicore Task Management API (MTAPI) working group that is focused on dynamic scheduling and mapping tasks to processor cores to help optimize throughput on multicore systems. MTAPI will provide an API that allows parallel embedded software to be designed in a straightforward way, abstracting the hardware details and letting the software developer focus on the parallel solution of the problem. This is already turning out to be quite popular with fairly extensive member involvement. It’s an important piece of the puzzle that many developers will be able to utilize in the next one to two years.

Q: Multicore processors have a lot of obvious advantages, particularly performance. But are there any challenges? The need for more parallelism seems like one. What are some others? And how is The Multicore Association working to address those challenges?

M.L.: There are many challenges to using multicore processors; parallelizing code is just one of them. Again, it also depends on the type of multicore processor and how it is being used. While MTAPI is focused on the parallel solution, MCAPI is critical for enabling core-to-core communications, and our Multicore Resource Management API (MRAPI) specification deals with on-chip resources that are shared by two or more cores, such as shared memory and I/O.

Q: The MCA website has a lot of resources, such as webinars and a discussion group. Would you recommend those as a good way for developers to keep up with MCA activities?

M.L.: The webinars provide good background information on the projects we have completed so far, so I recommend these as starting points. The discussion group is very inactive. Better ways to stay up on activities include:

  • Subscribe to the MCA newsletter, which comes out every four to eight weeks, depending on whether there is news to report.
  • In special cases, nonmembers can attend meetings as guests; they can contact me to arrange that.
  • Attend the annual Multicore Expo, where members go into depth on the specifications and other industry folks present on various multicore technologies.


The Long-term Commitment of Embedded Wireless

Most businesspeople replace their cell phones roughly every two years. At the other extreme are machine-to-machine (M2M) wireless devices, which often are deployed for the better part of a decade and sometimes even longer. That’s because it’s an expensive hassle for an enterprise to replace tens of thousands of M2M devices affixed to trucks, utility meters, alarm systems, point-of-sale terminals or vending machines, to name just a few places where today’s more than 62 million M2M devices reside.

Those long lifecycles highlight why it’s important for enterprises to take a long view when developing an M2M strategy. For example, if your M2M device has to remain in service for the next 10 years, it could be cheaper to pay a premium for LTE hardware now rather than go with less expensive gear that runs on GPRS or CDMA 1X, networks that might be phased out before the end of this decade.

Confused? Mike Ueland, North American vice president and general manager at M2M vendor Telit Wireless Solutions, recently spoke with Intelligence in Software about how CIOs, IT managers and developers can sleep at night instead of worrying about obsolescence and other pitfalls.

Q: M2M isn’t a new technology. For example, many utilities have been using it for more than a decade to track usage instead of sending out armies of meter readers every month. Why aren’t more enterprises following suit?

Mike Ueland: It’s very similar to what it was like before we had the Internet, Ethernet and things like that, where you had all of these disconnected devices. There’s an incredible opportunity, depending on what the business and application are, to connect those devices and bring more information back, as well as being able to provide additional value-added services.

There have been some significant improvements in terms of technology and the cost to deploy an M2M solution. All of the M2M solutions have gotten much more mature. There are so many more people in the ecosystem to support you.

But we haven’t seen the uptake within the enterprise community. Part of that is because we’ve been in such a recessionary period over the past couple of years. No one really wants to start new projects.

Q: What are some pitfalls to avoid? For example, wireless carriers have to certify M2M devices before they’ll allow them on their network. How can enterprises avoid that kind of red tape and technical nitty-gritty?

M.U.: The mistake that we see a lot is that people try to bite off too much. They’ll say: “I need this custom device that needs to do this. Therefore I need to build a custom application.” They overcomplicate the solution.

There are so many off-the-shelf devices out there that can be quickly modified for your application. One of the benefits of that is you reduce technical risk and time to market because often those devices will be pre-certified on a carrier’s network. There are also a number of M2M application providers out there -- like Axeda, ILS and others -- that have an M2M architecture. That allows people to very quickly build their own application based on this platform.

Q: Price is another factor that’s kept a lot of enterprises out of M2M. How has that changed? For example, over the past few years, many wireless carriers have developed rate plans that make M2M more affordable.

M.U.: Across the board -- device costs, module costs, air-time costs -- all of these costs have come down probably by half in the past few years, if not more. So any business cases that were done two or three years ago are outdated.

In addition, there have been great technological improvements. For instance, the device sizes have continued to shrink. They use less power. So it opens it up for a whole range of applications that might not have been possible in the past.

Q: About 10 years ago, a lot of police departments and enterprises were scrambling to replace their M2M equipment because carriers were shutting down their CDPD networks. Today, those organizations have to make a similar choice: Do I deploy an M2M system that uses 2G or 3G and hope that those networks are still in service years from now? Or should I go ahead and use 4G now even though the hardware is expensive, coverage is limited and the bandwidth is far more than what I need?

M.U.: It depends on the application. For instance, AT&T is not encouraging new 2G designs. They’ve deployed a 3G network, and they’re starting to deploy a 4G network. They’d really like people to move in that direction. Verizon and Sprint have their equivalent version of 2G: CDMA 1X. Both Verizon and Sprint have publicly declared that those networks will be available through 2021. At the end of each year, they’re going to reevaluate that.

So depending on the planned lifespan of your application and where you plan to deploy it -- North America or elsewhere -- 2G networks will have varying degrees of longevity. Having said that, if you look at 4G in the U.S. outside of Verizon, there are not many significant deployments. It’s early days for 4G.

As module providers, we rely on lower-cost handset chipset pricing to drive M2M volumes. The cost of an LTE module is over $100 right now. Compare that to 2G, which is under $20 on average. It’s a big gap.