
TWICE applauds QNX OS-powered OnStar 4G LTE with VIP Award


Megan Alink
Our customers are all VIPs, and we love nothing more than seeing them shine with industry recognition. Recently, TWICE named the OnStar 4G LTE powered by the QNX Neutrino OS to its list of Very Important Product (VIP) Award winners in the in-dash navigation multimedia receivers category.

The product builds a Wi-Fi hotspot into the vehicle so customers can stay online easily while they’re on the go. Up to seven devices, including computers, smartphones, video game consoles and tablets, can be paired to the hotspot for use any time the car is on. OnStar 4G LTE also gives customers access to the same features that OnStar is known for, including emergency assistance, security, navigation and vehicle diagnostics.

Congratulations to our customer OnStar and the rest of the TWICE VIPs! You can view the full list of categories and winners on the TWICE website.


One OS, multiple safety applications

The latest version of our certified OS for ADAS systems and digital instrument clusters has a shorter product name — but a longer list of talents.

Paul Leroux
Can you ever deliver a safety-critical product to a customer and call it a day? For that matter, can you deliver any product to a customer and call it a day? These, of course, are rhetorical questions. Responsibility for a product, software or otherwise, rarely ends when you release it, especially when you add safety to the mix. In that case, it’s a long-term commitment that continues until the last instance of the product is retired from service. Which can take decades.

Mind you, people dedicated to building safety-critical products aren’t prone to sitting on their thumbs. From their perspective, product releases are simply milestones in a process of ongoing diligence and product improvement. For instance, at QNX Software Systems, we subject our OS safety products to continual impact analysis, even after they have been independently certified for use in functional safety systems. If that analysis calls for improved product, then improved product is what we deliver. With a refreshed certificate, of course.

Which brings me to the QNX OS for Safety. It’s a new — and newly certified — release of our field-proven OS safety technology, with a twist. Until now, we had one OS certified to the ISO 26262 standard (for automotive systems) and another certified to the IEC 61508 standard (for general embedded systems). The new release is certified to both of these safety standards and replaces the two existing products in one fell swoop.

So if you no longer see the QNX OS for Automotive Safety listed on the QNX website, not to worry. We’ve simply replaced it with an enhanced version that has a shorter product name and broader platform support — all with the same proven technology under the hood. (My colleague Patryk Fournier has put together an infographic that nicely summarizes the new release; see sidebar).

And if you’re at all surprised that a single OS can be certified to both 61508 and 26262, don’t be. As the infographic suggests, IEC 61508 provides the basis for many market-specific standards, including IEC 62304, EN 5012x, and, of course, ISO 26262.

Learn more about the QNX OS for Safety on the QNX website. And for more information on ISO 26262 and how it affects the design of safety-critical automotive systems, check out these whitepapers:


From ADAS to autonomous

A new webinar on how autonomous driving technologies will affect embedded software — and vice versa

Paul Leroux
When, exactly, will production cars become fully autonomous? And when will they become affordable to the average Jane or Joe? Good questions both, but in the meantime, the auto industry isn’t twiddling its collective thumbs. It’s already starting to build a more autonomous future through active-control systems that can avoid accidents (e.g. automated emergency braking) and handle everyday driving tasks (e.g. adaptive cruise control).

These systems rely on software to do their job, and that reliance will grow as the systems become more sophisticated and cars become more fully autonomous. This trend, in turn, will place enormous pressure on how the software is designed, developed, and maintained. Safety, in particular, must be front and center at every stage of development.

Which brings me to a new webinar from my inestimable colleague, Kerry Johnson. Titled “The Role of a Software Platform When Transitioning from ADAS to Autonomous Driving,” the webinar will examine:
  • the emergence of high-performance systems-on-chip that target ADAS and autonomous vehicle applications
  • the impact of increasing system integration and autonomous technologies on embedded software
  • the need for functional safety standards such as ISO 26262
  • the emergence of pre-certified products as part of the solution to address safety challenges
  • the role of a software platform to support the evolution from ADAS to autonomous driving

If you are tasked with either developing or sourcing software for functional safety systems in passenger vehicles, this webinar is for you. Here are the coordinates:

Wednesday, October 7
1:00pm EDT

Registration Site



Developing safety-critical systems? This book is for you

In-depth volume covers development of systems under the IEC 61508, ISO 26262, EN 50128, and IEC 62304 standards

Paul Leroux
In June, I told you of an upcoming book by my colleague Chris Hobbs, who works as a software safety specialist here at QNX Software Systems. Well, I’m happy to say that the book is now available. It’s called Embedded Software Development for Safety-Critical Systems and it explores design practices for building medical devices, railway control systems, industrial control systems, and, of course, automotive ADAS devices.

The book:
  • covers the development of safety-critical systems under ISO 26262, IEC 61508, EN 50128, and IEC 62304
  • helps developers learn how to justify their work to external auditors
  • discusses the advantages and disadvantages of architectural and design practices recommended in the standards, including replication and diversification, anomaly detection, and so-called “safety bag” systems
  • examines the use of open-source components in safety-critical systems
Interested? I invite you to visit the CRC Press website, where you can view the full Table of Contents and, of course, order the book.

Looking forward to getting my copy!

A low-down look at the QNX concept cars

Paul Leroux
It’s that time of year again. The QNX concept team has set the wheels in motion and started work on a brand new technology concept car, to be unveiled at CES 2016.

The principle behind our technology concept cars is simple in theory, but challenging in practice: Take a stock production vehicle off the dealer’s lot, mod it with new software and hardware, and create user experiences that make driving more connected, more enjoyable, and, in some cases, even safer.

It’s always fun to guess what kind of car the team will modify. But the real story lies in what they do with it. In recent years, they’ve implemented cloud-based diagnostics, engine sound enhancement, traffic sign recognition, collision warnings, speed alerts, natural voice recognition — the list goes on. There’s always a surprise or two, and I intend to keep it that way, so no hints about the new car until CES. ;-)

In the meantime, here is a retrospective of QNX technology concept cars, past and present. It’s #WheelWednesday, so instead of the usual eye candy, I’ve chosen images to suit the occasion. Enjoy.

The Maserati Quattroporte GTS
From the beginning, our technology concept cars have demonstrated how the QNX platform helps auto companies create connected (and compelling) user experiences. The Maserati, however, goes one step further. It shows how QNX can enable a seamless blend of infotainment and ADAS technologies to simplify driving tasks, warn of possible collisions, and enhance driver awareness. The car can even recommend an appropriate speed for upcoming curves. How cool is that?




The Mercedes CLA 45 AMG
By their very nature, technology concept cars have a short shelf life. The Mercedes, however, has defied the odds. It debuted in January 2014, but is still alive and well in Europe, and is about to be whisked off to an event in Dubai. The car features a multi-modal user experience that blends touch, voice, physical buttons, and a multi-function controller, enabling users to interact naturally with infotainment functions. The instrument cluster isn’t too shabby, either. It will even warn you to ease off the gas if you exceed the local speed limit.




The Bentley Continental GT
I dubbed our Bentley the “ultimate show-me car,” partly because “show me” is exactly what people would say when you put them behind the wheel. The digital cluster was drop-dead gorgeous, but the head unit was the true pièce de résistance — an elegantly curved 17” high-definition display based on TI’s optical touch technology. And did I mention? The car’s voice rec system spoke with an English accent.




The Porsche 911 Carrera
Have you ever talked to a Porsche? Well, in this case, you could — and it would even talk back. We outfitted our 911 with cloud-based voice recognition (so you could control the nav system using natural language) and text-to-speech (so you could listen to incoming BBMs, emails, and text messages). But my favorite feature was one-touch Bluetooth pairing: you simply touched your phone to an NFC reader in the center console and, hey presto, the phone and car were automatically paired.




The Chevrolet Corvette
I have a confession to make: The Corvette is the only QNX technology concept car that I got to drive around the block. For some unfathomable reason, they never let me drive another one. Which is weird, because I saw the repair bill, and it wasn’t that much. In any case, the Corvette served as the platform for the very first QNX technology concept car, back in 2010. It included a reconfigurable instrument cluster and a smartphone-connected head unit — features that would become slicker and more sophisticated in our subsequent concept vehicles. My favorite feature: the reskinnable UI.




The Jeep Wrangler
Officially, the Wrangler serves as the QNX reference vehicle, demonstrating what the QNX CAR Platform can do out of the box. But it also does double-duty as a concept vehicle, showing how the QNX platform can help developers build leading-edge ADAS solutions. My favorite features: in-dash collision warnings and a fast-booting backup display.



Well, there you have it. In just a few months’ time, we will have the honor of introducing you to a brand new QNX technology concept car. Any guesses as to what the wheels will look like?



If you liked this post, you may also be interested in... The lost concept car photos

The ethics of robot cars

“By midcentury, the penetration of autonomous vehicles... could ultimately cause vehicle crashes in the U.S. to fall from second to ninth place in terms of their lethality ranking.” — McKinsey

Paul Leroux
If you saw a discarded two-by-four on the sidewalk, with rusty nails sticking out of it, what would you do? Chances are, you would move it to a safe spot. You might even bring it home, pull the nails out, and dispose of it properly. In any case, you would feel obliged to do something that reduces the probability of someone getting hurt.

Driver error is like a long sharp nail sticking out of that two-by-four. It is, in fact, the largest single contributor to road accidents. Which raises the question: If the auto industry had the technology, skills, and resources to build vehicles that could eliminate accidents caused by human error, would it not have a moral obligation to do so? I am speaking, of course, of self-driving cars.

Now, a philosopher I am not. I am ready to accept that my line of thinking on this matter has more holes than Swiss cheese. But if so, I’m not the only one with Emmenthal for brain matter. I am, in fact, in good company.

Take, for example, Bryant Walker Smith, a professor in the schools of law and engineering at the University of South Carolina. In an article in MIT Technology Review, he argues that, given the number of accidents that involve human error, introducing self-driving technology too slowly could be considered unethical. (Mind you, he also underlines the importance of accepting ethical tradeoffs. We already accept that airbags may kill a few people while saving many; we may have to accept that the same principle will hold true for autonomous vehicles.)

Then there’s Roger Lanctot of Strategy Analytics. He argues that government agencies and the auto industry need to move much more aggressively on active-safety features like automated lane keeping and automated collision avoidance. He reasons that, because the technology is readily available — and can save lives — we should be using it.

Mind you, the devil is in the proverbial details. In the case of autonomous vehicles, the ethics of “doing the right thing” is only the first step. Once you decide to build autonomous capabilities into a vehicle, you often have to make ethics-based decisions as to how the vehicle will behave.

For instance, what if an autonomous car could avoid a child running across the street, but only at the risk of driving itself, and its passengers, into a brick wall? Whom should the car be programmed to save? The child or the passengers? And what about a situation where the vehicle must hit either of two vehicles — should it hit the vehicle with the better crash rating? If so, wouldn’t that penalize people for buying safer cars? This scenario may sound far-fetched, but vehicle-to-vehicle (V2V) technology could eventually make it possible.

The “trolley problem” captures the dilemma nicely:



Being aware of such dilemmas gives me more respect for the kinds of decisions automakers will have to make as they build a self-driving future. But you know what? All this talk of ethics brings something else to mind. I work for a company whose software has, for decades, been used in medical devices that help save lives. Knowing that we do good in the world is a daily inspiration — and has been for the last 25 years of my life. And now, with products like the QNX OS for Safety, we are starting to help automotive companies build ADAS systems that can help mitigate driver error and, ultimately, reduce accidents. So I’m doubly proud.

More to the point, I believe this same sense of pride, of helping to make the road a safer place, will be a powerful motivator for the thousands of engineers and development teams dedicated to paving the road from ADAS to autonomous. It’s just one more reason why autonomous cars aren’t a question of if, but only of when.

What does a decades-old thought experiment have to do with self-driving cars?

Paul Leroux
Last week, I discussed, ever so briefly, some ethical issues raised by autonomous vehicles — including the argument that introducing them too slowly could be considered unethical!

My post included a video link to the trolley problem, a thought experiment that has long served as a tool for exploring how people make ethical decisions. In its original form, the trolley problem is quite simple: You see a trolley racing down a track on which five people are tied up. Next to you is a lever that can divert the trolley to an empty track. But before you can pull the lever, you notice that someone is, in fact, tied up on the second track. Do you do nothing and let all five people die, or do you pull the lever and kill the one person instead?

The trolley problem has undergone criticism for failing to represent real-world problems, for being too artificial. But if you ask Patrick Lin, a Cal Poly professor who has delivered talks to Google and Tesla on the ethics of self-driving cars, it can serve as a helpful teaching tool for automotive engineers — especially if its underlying concept is framed in automotive terms.

Here is how he presents it:

“You’re driving an autonomous car in manual mode—you’re inattentive and suddenly are heading towards five people at a farmer’s market. Your car senses this incoming collision, and has to decide how to react. If the only option is to jerk to the right, and hit one person instead of remaining on its course towards the five, what should it do?”

Of course, autonomous cars, with their better-than-human driving habits (e.g. people tailgate, robot cars don’t) should help prevent such difficult situations from happening in the first place. In the meantime, thinking carefully through this and other scenarios is just one more step on the road to building fully autonomous, and eventually driverless, cars.

Read more about the trolley problem and its application to autonomous cars in a recent article in The Atlantic.

Speaking of robot cars, if you missed last week's webinar on the role of software when transitioning from ADAS to autonomous driving, don't sweat it. It's now available on demand at Techonline.

ADAS: The ecosystem's next frontier

At DevCon last week, Renesas showcased their ADAS concept vehicle. It was just what you would expect from an advanced demonstration, combining radar, lidar, cameras, V2X, algorithms, multiple displays and a huge amount of software to make it all work. They were talking about sensor fusion and complete surround view and, well, you get the picture.

What isn’t readily obvious as you experience the demo is the investment made and the collaboration required by Renesas and their ADAS ecosystem.

Partnership is a seldom-recognized cornerstone of what will ultimately become true sensor fusion. It seems, to me at least, unlikely that anyone will be able to develop the entire system on their own. As processors become more and more powerful, the discrete ECUs will start to collapse into less distributed architectures with much more functionality on each chip. The amount of data coming into and being transmitted by the vehicle will continue to grow, and the need to secure it will grow alongside. V2X, high-definition map data, algorithms, specialized silicon, vision acceleration, and more will become the norm in every vehicle.

How about QNX Software Systems? Are we going to do all of this on our own? I doubt it. Instead, we will continue to build on the same strategy that has helped take us to a leadership position in the infotainment market: collaborating with best-of-breed companies to deliver a solution on a safety-certified foundation that customers can leverage to differentiate their products.

The view from above at Renesas DevCon.

Five reasons why they should test autonomous cars in Ontario

Did I say five? I meant six…

Paul Leroux
It was late and I needed to get home. So I shut down my laptop, bundled myself in a warm jacket, and headed out to the QNX parking lot. A heavy snow had started to fall, making the roads slippery — but was I worried? Not really. In Ottawa, snow is a fact of life. You learn to live with it, and you learn to drive in it. So I cleared off the car windows, hopped in, and drove off.

Alas, my lack of concern was short-lived. The further I drove, the faster and thicker the snow fell. And then, it really started to come down. Pretty soon, all I could see out my windshield was a scene that looked like this, but with even less detail:



That’s right: a pure, unadulterated whiteout. Was I worried? Nope. But only because I was in a state of absolute terror. Fortunately, I could see the faintest wisp of tire tracks immediately in front of my car, so I followed them, praying that they didn’t lead into a ditch, or worse. (Spoiler alert: I made it home safe and sound.)

Of course, it doesn’t snow every day in Ottawa — or anywhere else in Ontario, for that matter. That said, we can get blanketed with the white stuff any time from October until April. And when we do, the snow can play havoc with highways, railways, airports, and even roofs.

Roofs, you say? One morning, a few years ago, I heard a (very) loud noise coming from the roof of QNX headquarters. When I looked out, this is what I saw — someone cleaning off the roof with a snow blower! So much snow had fallen that the integrity of the roof was being threatened:



When snow like this falls on the road, it can tax the abilities of even the best driver. But what happens when the driver isn’t a person, but the car itself? Good question. Snow and blowing snow can mask lane markers, cover street signs, and block light-detection sensors, making it difficult for an autonomous vehicle to determine where it should go and what it should do. Snow can even trick the vehicle into “seeing” phantom objects.

And it’s not just snow. Off the top of my head, I can think of four other phenomena common to Ontario roads that pose a challenge to human and robot drivers alike: black ice, freezing rain, extreme temperatures, and moose. I am only half joking about the last item: autonomous vehicles must respond appropriately to local fauna, not least when the animal in question weighs half a ton.

To put it simply, Ontario would be a perfect test bed for advancing the state of autonomous technologies. So imagine my delight when I learned that the Ontario government has decided to do something about it.

Starting January 1, Ontario will become the first Canadian province to allow road testing of automated vehicles and related technology. The provincial government is also pledging half a million dollars to the Ontario Centres of Excellence Connected Vehicle/Automated Vehicle Program, in addition to $2.45 million already provided.

The government has also installed some virtual guard rails. For instance, it insists that a trained driver stay behind the wheel at all times. The driver must monitor the operation of the autonomous vehicle and take over control whenever necessary.

Testing autonomous vehicles in Ontario simply makes sense, but not only because of the weather. The province also has a lot of automotive know-how. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here, as do 350 parts suppliers. Moreover, the province has almost 100 companies and institutions involved in connected vehicle and automated vehicle technologies — including, of course, QNX Software Systems and its parent company, BlackBerry.

So next time you’re in Ontario, take a peek at the driver in the car next to you. But don’t be surprised if he or she isn’t holding the steering wheel.


A version of this post originally appeared in the Connected Car Expo blog.

An ADAS glossary for the acronym challenged

If you’ve got ACD, you’ve come to the right place.

Paul Leroux
Someday, in the not-so-distant future, your mechanic will tell you that your CTA sensor has gone MIA. Or that your EDA needs an OTA update. Or that the camera system for your PLD has OSD. And when that day happens, you’ll be glad you stumbled across this post. Because I am about to point you to a useful little glossary that takes the mystery out of ADAS acronyms. (The irony being, of course, that ADAS is itself an acronym.)

Kidding aside, acronyms can stand in the way of clear communication — but only when used at the wrong time and place. Otherwise, they serve as useful shorthand, especially among industry insiders who have better things to do than say “advanced driver assistance system” 100 times a day when they can simply say ADAS instead.

In any case, you can find the glossary here. And when you look at it, you’ll appreciate my ulterior motive for sharing the link — to demonstrate that the ADAS industry is moving apace. The glossary makes it abundantly clear that the industry is working on, or has already developed, a large variety of ADAS systems. The number will only increase, thanks to government calls for vehicle safety standards, technology advances that make ADAS solutions more cost-effective, and growing consumer interest in cars that can avoid crashes. In fact, Visiongain has estimated that the global ADAS market will experience double-digit growth between 2014 and 2024, from a baseline estimate of $18.2 billion.

And in case you’re wondering, ACD stands for acronym challenged disorder. ;-)

Bringing a bird’s eye view to a car near you

QNX and TI team up to enable surround-view systems in mass-volume vehicles

Paul Leroux
Uh-oh. You are 10 minutes late for your appointment and can’t find a place to park. At long last, a space opens up, but sure enough, it’s the parking spot from hell: cramped, hard to access, with almost no room to maneuver.

Fortunately, you’ve got this covered. You push a button on your steering wheel, and out pops a camera drone from the car’s trunk. The drone rises a few feet and begins to transmit a bird’s eye view of your car to the dashboard display — you can now see at a glance whether you are about to bump into curbs, cars, concrete barriers, or anything else standing between you and parking nirvana. Seconds later, you have backed perfectly into the spot and are off to your meeting.

Okay, that’s the fantasy. In reality, cars with dedicated camera drones will be a long time coming. In the meantime, we have something just as good and a lot more practicable — an ADAS application called surround view.

Getting aligned
Approaching an old problem from a new perspective. Credit: TI
Surround-view systems typically use four to six fisheye cameras installed at the front, back, and sides of the vehicle. Together, these cameras capture a complete view of the area around your car, but there’s a catch: the video frames they generate are highly distorted. So, to start, the surround-view system performs geometric alignment of every frame. Which is to say, it irons all the curves out.

Next, the system stitches the corrected video frames into a single bird’s eye view. Mind you, this step isn’t simply a matter of aligning pixels from several overlapping frames. Because each camera points in a different direction, each will generate video with unique color balance and brightness levels. Consequently, the system must perform photometric alignment of the image. In other words, it corrects these mismatches to make the resulting output look as if it were taken by a single camera hovering over the vehicle.
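To make those two alignment steps concrete, here’s a rough sketch of how they might look in code. This is purely illustrative — it assumes each camera comes with pre-calibrated fisheye intrinsics (K, D) and a precomputed ground-plane homography (H), and it uses OpenCV rather than anything from the QNX or TI stack:

```python
# Illustrative sketch only; not production surround-view code.
import cv2
import numpy as np

def geometric_alignment(frame, K, D, H, canvas_size):
    """Iron the fisheye curves out, then warp the corrected frame
    into the top-down (bird's eye) canvas."""
    undistorted = cv2.fisheye.undistortImage(frame, K, D, Knew=K)
    return cv2.warpPerspective(undistorted, H, canvas_size)

def photometric_alignment(src, ref, overlap_mask):
    """Match the per-channel brightness of src to ref inside the region
    where the two camera views overlap, so the stitched output looks
    as if it came from a single camera."""
    out = src.astype(np.float32)
    for c in range(3):
        src_mean = src[..., c][overlap_mask].mean()
        ref_mean = ref[..., c][overlap_mask].mean()
        if src_mean > 0:
            out[..., c] *= ref_mean / src_mean
    return np.clip(out, 0, 255).astype(np.uint8)
```

A real system would also blend the seams, and it has to run this for every camera on every frame — which is exactly why the compute budget matters.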

Moving down-market
If you think that all this work takes serious compute power, you’re right. The real trick, though, is to make the system affordable so that luxury car owners aren’t the only ones who can benefit from surround view.

Which brings me to QNX Software Systems’ support for TI’s new TDA2Eco system-on-chip (SoC), which is optimized for 3D surround view and park-assist applications. The TDA2Eco integrates a variety of automotive peripherals, including CAN and Gigabit Ethernet AVB, and supports up to eight cameras through parallel, serial and CSI-2 interfaces. To enable 3D viewing, the TDA2Eco includes an image processing accelerator for decoding multiple camera streams, along with graphics accelerators for rendering virtual views.

Naturally, surround view also needs software, which is where the QNX OS for Safety comes in. The OS can play several roles in surround-view systems, such as handling camera input, hosting device drivers for camera panning and control, and rendering the processed video onto the display screen, using QNX Software Systems’ high-performance Screen windowing system. The QNX OS for Safety complies with the ISO 26262 automotive functional safety standard and has a proven history in safety-critical systems, making it ideally suited for collision warning, surround view, and a variety of other ADAS applications.

Okay, enough from me. Let’s look at a video, hosted by TI’s Gaurav Agarwal, to see how the TDAx product line can support surround-view applications:



For more information on the TDAx product line, visit the TI website; for more on the QNX OS for Safety, visit the QNX website.

The demo is in the details

A new video of the 2015 QNX technology concept car reveals some thoughtful touches.

Paul Leroux
QNX technology concept cars serve a variety of purposes. They demonstrate, for example, how the flexibility of QNX technology can help automakers deliver unique user experiences. They also serve as vehicles — pun fully intended — for showcasing our vision of connected driving. And they explore how thoughtful integration of new technologies can make driving easier and more enjoyable.

It is this thoughtfulness that impresses me most about the cars. It is also the hardest aspect to convey in words and pictures — nothing beats sitting inside one of the cars and experiencing the nuances first hand.

The minute you get behind the wheel, you realize that our concept team is exploring answers to a multitude of questions. For instance, how do you bring more content into a car, without distracting the driver? How do you take types of information previously distributed across two or more screens and integrate them on a single display? How do you combine information about local speed limits with speedometer readouts to promote better driving? How do you make familiar activities, such as using the car radio, simpler and more intuitive? And how much should a car’s UX rely on the touch gestures that have become commonplace on smartphones and tablets?

Okay, enough from me. To see how our 2015 technology concept car, based on a Maserati Quattroporte, addresses these and other questions, check out this new video with my esteemed colleague Justin Moon. Justin does a great job of highlighting many of the nuances I just alluded to:



In just over a month, QNX will unveil a brand new technology concept vehicle. What kinds of questions will it explore? What kinds of answers will it propose? We can’t say too much yet, but stay tuned to this channel and to our CES 2016 microsite.

Ford ports SmartDeviceLink to QNX CAR Platform

QNX joins Ford, Toyota, and other industry leaders to help drive new standard for app integration.

Paul Leroux
For as long as I can remember, QNX Software Systems has been at the forefront of integrating cars and smartphones. Through our flexible OS architecture and large automotive ecosystem, we provide automakers and Tier 1 suppliers with the ultimate choice in connectivity options for smartphones and other smart devices. And now, QNX customers will have even greater choice, with the availability of Ford’s SmartDeviceLink (SDL) technology for the QNX CAR Platform for Infotainment.

If you’ve never heard of SDL, it’s the open source version of Ford AppLink, the software that allows Ford SYNC users to access smartphone apps through voice commands and dashboard controls. Ford donated AppLink to the open source community to create a standard way for consumers to interact with smartphone apps, regardless of which phone they use or vehicle they drive.

SDL is quickly gaining industry advocates, including Toyota, UI Evolution, and, of course, QNX. What’s more, companies like PSA, Honda, Subaru, and Mazda are evaluating it for use in next-generation vehicles.

Why the interest in SDL? Because it’s a flexible, vendor-neutral standard that can benefit drivers, automakers, and developers alike. With SDL:

  • Drivers can interact with apps by using voice commands, steering-wheel buttons, and other in-car controls, so they can keep their eyes on the road and hands on the wheel.
  • Automakers can deliver a consistent app experience across vehicles, while retaining the flexibility to customize that experience for each vehicle brand or model.
  • Developers can create apps that can work across multiple smart devices and multiple automotive brands — which means they have greater incentive to create automotive apps.

SDL for QNX builds on a history of successful collaborations between Ford and QNX, including the QNX-powered Ford SYNC 3 infotainment system. According to Paul Elsila, CEO of Livio, the Ford subsidiary that maintains the SDL open source project, “With its large market share, QNX can play a key role in driving the adoption of auto industry standards, and we are excited to work with them in building vendor-neutral technology that can simplify the integration of smartphone apps in any brand or type of vehicle.”

SDL works with multiple smartphone platforms. Moreover, it is highly flexible: it can work across a full range of vehicles, from entry-level to premium, and across a wide range of displays. It can even be used in systems without displays — for instance, in systems that use a voice interface.
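To give a flavor of how an app talks to an SDL head unit, here’s a heavily simplified sketch. RegisterAppInterface and Show are real RPC names from the SDL spec, but the plain-JSON framing below is an assumption made for illustration — the actual protocol wraps RPCs in framed sessions with correlation IDs, handled by the official SDK:

```python
# Illustrative only; not the official SDL SDK.
import json

def register_app_interface(app_name, app_id):
    """Build the first RPC an app sends, announcing itself to the head unit."""
    return json.dumps({
        "method": "RegisterAppInterface",
        "params": {
            "appName": app_name,
            "appID": app_id,
            "isMediaApplication": True,
            "languageDesired": "EN-US",
        },
    })

def show_main_field(text):
    """Build a Show RPC that puts a line of text on the vehicle display."""
    return json.dumps({"method": "Show", "params": {"mainField1": text}})
```

Because the head unit owns the rendering, the same app can look native on an entry-level display, a premium screen, or a voice-only system.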

To learn more about SDL, check out the announcements that Ford, Toyota, and QNX issued this morning.

QNX announces new platforms for automated driving systems and in-car acoustics

Paul Leroux
Every year, at CES, QNX Software Systems showcases its immense range of solutions for infotainment systems, digital instrument clusters, telematics systems, advanced driver assistance systems (ADAS), and in-car acoustics. This year is no different. Well, actually… let me take that back. Because this year, we are also announcing two new and very important software platforms: one that can speed the development of automated driving systems, and one that can transform how acoustics applications are implemented in the car.

QNX Platform for ADAS
The automotive industry is at an inflection point, with autonomous and semiautonomous vehicles moving from theory to reality. The new QNX Platform for ADAS is designed to help drive this industry transformation. Based on our deep automotive experience and 30-year history in safety-critical systems, the platform can help automotive companies reduce the time and effort of building a full range of ADAS and automated driving applications:
  • from informational ADAS systems that provide a multi-camera, 360° surround view of the vehicle…
  • to sensor fusion systems that combine data from multiple sources such as cameras and radar…
  • to advanced high-performance systems that make control decisions in fully autonomous vehicles



Highlights of the platform include:
  • The QNX OS for Safety, a highly reliable OS pre-certified at all of the automotive safety integrity levels needed for automated driving systems.
  • An OS architecture that can simplify the integration of new sensor technologies and purpose-built ADAS processors.
  • Frameworks and reference implementations to speed the development of multi-camera vision systems and V2X applications (vehicle-to-vehicle and vehicle-to-infrastructure communications).
  • Pre-integrated partner technologies, including systems-on-chip (SoCs), vision algorithms, and V2X modules, to enable faster time-to-market for customers.

This week, at CES 2016, QNX will present several ADAS and V2X demonstrations, including:
  • Demos that show how QNX-based ADAS systems can perform realtime analysis of complex traffic scenarios to enhance driver awareness or enable various levels of automated driving.
  • QNX-based V2X technology that allows cars to “talk” to each other and to traffic infrastructure (e.g. traffic lights) to prevent collisions and improve traffic flow.

To learn more, check out the ADAS platform press release, as well as the press release that provides a full overview of our many CES demos — including, of course, the latest QNX technology concept vehicle!

QNX Acoustics Management Platform
It’s a lesser-known fact, but QNX is a leader in automotive acoustics — its software for handsfree voice communications has shipped in over 40 million automotive systems worldwide. This week, QNX is demonstrating once again why it is a leader in this space, with a new, holistic approach to managing acoustics in the car, the QNX Acoustics Management Platform (AMP):

  • Enables automakers to enhance the audio and acoustic experience for drivers and passengers, while reducing system costs and complexity.
  • Replaces the traditional piecemeal approach to in-car acoustics with a unified model: automakers can now manage all aspects of in-car acoustics efficiently and holistically, for easier integration and tuning, and for faster time-to-production.
  • Reduces hardware costs with a new, low-latency audio architecture that eliminates the need for dedicated digital signal processors or specialized external hardware.
  • Integrates a full suite of acoustics modules, including QNX Acoustics for Voice (for handsfree systems), QNX Acoustics for Engine Sound Enhancement, and the brand new QNX In-Car Communication (ICC).

For anyone who has struggled to hold a conversation in a car at highway speeds, QNX ICC enhances the voice of the driver and relays it to loudspeakers in the back of the vehicle. Instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX will demonstrate ICC this week at CES, in its latest technology concept car, based on a Toyota Highlander.

Read the press release to learn more about QNX AMP.



The simpler, the better: a first look at the new QNX technology concept vehicle

Bringing the KISS principle to the dashboard.

Paul Leroux
“From sensors to smartphones, the car is experiencing a massive influx of new technologies, and automakers must blend these in a way that is simple, helpful, and non-distracting.” That statement comes from a press release we issued a year ago, but it’s as true today as it was then — if not more so. The fact is, the car is undergoing a massive transformation as it becomes more connected and more automated. And with that transformation comes higher volumes of data and greater system complexity.

But here’s the thing. From the driver’s perspective, this complexity doesn’t matter, nor should it matter. In fact, it can’t matter. Because the driver needs to stay focused on the most important thing: driving. (At least until fully automated driving becomes reality, at which point a nap might be in order!) Consequently, it’s the job of automakers and their suppliers to harness all these technologies in a simple, intuitive way that makes driving easier, safer, and more enjoyable. Specifically, they need to provide the driver with relevant, contextually sensitive information that is easy to consume, without causing distraction.

That is the challenge that the new QNX technology concept vehicle, based on a Toyota Highlander, sets out to explore.

So what are we waiting for? Let’s take a look! (And remember, you can click on any image to magnify it.)

The oh-so-glossy exterior
As with any QNX technology concept vehicle, it’s what’s inside that counts. But to signal that this is no ordinary Highlander, we gave the exterior a luxurious, brushed-metal finish that just screams to have its picture taken. So we obliged:



The integrated display that keeps you focused
When modifying the Highlander, simplicity was the watchword. So instead of equipping the vehicle with both a digital instrument cluster and a head unit, we created a “glass cockpit” that combines the functions of both systems, along with ADAS safety alerts, into one seamless display. Everything is presented directly in front of the driver, where it is easiest to see.

For instance, in the following scenario, the cockpit allows the driver to see several pieces of important information at a glance: a forward-collision warning, an alert that the car is exceeding the local speed limit by 12 mph, and turn-by-turn navigation:



Mind you, the cockpit can display much more information than you see here, including a tachometer, album art, incoming phone calls, and the current radio station. But to keep distraction to a minimum, it displays only the information that the driver currently requires, and no more. Because simplicity.

To further minimize distraction, the cockpit uses voice as the primary way to control the user interface, including control of media, navigation, and phone connectivity. As a result, drivers can access infotainment content while keeping their hands on the wheel and eyes on the road.

Thoughtful touches abound. For instance, the HERE Auto navigation software running in the cockpit interfaces with a HERE Auto Companion App running on a BlackBerry PRIV smartphone. So when the driver steps into the vehicle, navigation route information from the smartphone is transferred automatically to the vehicle, providing a continuous user experience. How cool is that?

Here’s a slightly different view of the cockpit, showing how it can display a photo of your destination — just the thing when you are driving to a location for the first time and would like visual confirmation of what it looks like:



Before I forget, here are some additional tech specs: the cockpit is built on the QNX CAR Platform for Infotainment, uses an interface based on Qt 5.5, integrates iHeartRadio, and runs on a Renesas R-Car H2 system-on-chip.

The acoustics feature that keeps you from shouting
The glass cockpit does a great job of keeping your eyes focused straight ahead. But what’s the use of that if, as a driver, you have to turn your head every time you want to speak to someone in the back seat? If you’ve ever struggled to hold a conversation in a car at highway speeds, especially in a larger vehicle, you know what I’m talking about.

QNX acoustics to the rescue! Earlier today, QNX Software Systems announced the QNX Acoustics Management Platform, a new solution that replaces the traditional piecemeal approach to in-car acoustics with a holistic model that enables faster time-to-production and lower system costs. The platform comes with several innovative features, including QNX In-Car Communication (ICC) technology, which enhances the voice of the driver and relays it to infotainment loudspeakers in the rear of the car.

Long story short: instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX ICC dynamically adapts to noise conditions and adds enhancement only when needed. Better yet, it allows automakers to leverage their existing handsfree telephony microphones and infotainment loudspeakers.
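If you’re curious about the “only when needed” part, here’s a toy sketch of the underlying idea: measure the cabin noise, derive a reinforcement gain from it, and apply that gain to the driver’s speech on its way to the rear loudspeakers. The quiet-floor and gain-cap values are made up for illustration, and a real ICC implementation also has to manage echo paths, latency, and feedback stability:

```python
# Toy illustration of noise-dependent speech reinforcement; not QNX ICC.
import numpy as np

def icc_gain(noise_rms, quiet_floor=0.01, max_gain_db=12.0):
    """No reinforcement in a quiet cabin; gain rises with noise,
    capped so the loudspeaker-to-microphone loop stays stable."""
    ratio = max(noise_rms / quiet_floor, 1.0)
    gain_db = min(20.0 * np.log10(ratio), max_gain_db)
    return 10.0 ** (gain_db / 20.0)

def process_frame(mic_frame, noise_rms):
    """Apply the noise-dependent gain to one frame of driver speech
    destined for the rear loudspeakers."""
    return np.clip(mic_frame * icc_gain(noise_rms), -1.0, 1.0)
```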



The reference vehicle that keeps evolving
Before you go, I also want to share some updates to the QNX reference vehicle, which is based on a Jeep Wrangler. Like the Highlander, the Jeep got a slick new exterior for CES 2016:



Since 2012, the Jeep has been our go-to vehicle for showcasing the latest capabilities of the QNX CAR Platform for Infotainment. But for over a year now, it has done double-duty as a concept vehicle, showing how QNX technology can help developers build next-generation instrument clusters and ADAS solutions.

Take, for example, the Jeep’s new instrument cluster, which makes its debut this week at CES. In addition to providing all the information that you’d expect, such as speed and RPM, it displays crosswalk notifications, forward collision warnings, speed limit warnings, and turn-by-turn navigation:



The QNX reference vehicle also includes a full-featured head unit that demonstrates the latest out-of-the-box capabilities of the QNX CAR Platform for Infotainment. For example, in this image, the head unit is displaying HERE Auto navigation:



Other features of the platform include:
  • A voice interface that uses natural language processing, making it easy to launch applications, play music, select radio stations, control volume, use the navigation system, and perform a variety of other tasks.
  • A new, easy-to-navigate UI based on Qt 5.5 that supports a variety of touch gestures, including tap, swipe, pinch, and zoom.
  • QNX acoustics technology that enables clear, easy-to-understand hands-free calls through advanced echo cancellation and noise reduction.
  • Cellular connectivity provided by the QNX Wireless Framework, which simplifies system design by managing the complexities of modem control on behalf of applications.
  • Flexible support for a variety of smartphone integration protocols.

Additional tech specs: The Jeep’s cluster runs on a Qualcomm Snapdragon 602A processor and its user interface was designed by our partner Rightware, using the Rightware Kanzi tool. The head unit, meanwhile, runs on an Intel Atom E3827 processor.

ADAS, augmented reality, V2X, IoT, and more
I have only scratched the surface of what BlackBerry and QNX Software Systems are demonstrating this week at CES 2016. There’s much more to see and experience, including a very cool V2X demonstration, IoT solutions for the automotive and transportation industries, as well as ADAS and augmented reality systems that integrate with the digital clusters described in this post. To learn more, read the press release that QNX issued today and stay tuned to this channel.



Video: Paving the way to an autonomous future

Lynn Gayowski
CES 2016 is now underway, and our kickoff to the year wouldn’t be complete without a behind-the-scenes look at the making of our new technology concept vehicle and updated reference vehicle.

The video below follows the journey of building our vehicles for CES 2016 and highlights the technologies we’re using to speed progress towards automated driving — and the list of tech that QNX covers is impressive! It includes advanced driver assistance systems (ADAS), V2X, and augmented reality, not to mention digital instrument clusters, in-car communication, and infotainment:



QNX Software Systems continues to innovate in automotive, with a vision for the evolution of automated driving and a trusted foundation for building reliable, adaptable systems. At risk of giving away the big finale, I think John Wall, head of QNX, sums up perfectly what QNX is on target for in the automotive industry: “We will dominate the cockpit of the car.” It’s a bold statement but we’re already amassing some imposing stats that back this up:

In the zone — a visit to the QNX concept garage

Guest post by QNX consultant and software designer Rob Krten.

How often have you heard the expression, “If it were easy to do, everyone would do it”? I’m constantly amazed at the things that QNX does with their concept cars. To me, a car is an inviolate object that must be touched only by the dealer (well, ok, I do top up the windshield wiper fluid and I once changed a battery). I don’t say that because I necessarily like to give the dealer money, but I just don’t want to break anything that’ll cost me more to get fixed properly later.

Pushing the envelope, however, means getting right in there and doing stuff. QNX engineers have done this for their technology concept cars — from replacing the mirrors with LCD screens, to getting right into the dash and rebuilding it, to adding cameras into the antenna fin on the roof. It’s nothing for them to rip out the center console and then look at all the wiring and go, “Huh, ok — so we need to lengthen this wire, add a shim here, move this piece,” and so on. They are fearless.

Redoing the dash of the QNX reference vehicle.
Sometimes the “getting right in there” is physical; other times, it’s software based — such as making a new application that lives in the infotainment stack or that interfaces with a smartphone. Like a “Dude, where’s my car?” feature — when your Bluetooth phone unpairs with your car, the phone records the current GPS position. Later, when you’re looking for your car, your phone can recall this last stored GPS position — this must be where you left your car. Or even simple aids, such as a radio tuner that detects when you are losing an AM/FM signal and automatically switches to the corresponding digital station, so you can continue listening to your favorite station anywhere you drive.
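The “Dude, where’s my car?” idea is simple enough to sketch in a few lines. The unpair event hook and file-based storage below are hypothetical stand-ins for whatever the phone platform actually provides:

```python
# Hypothetical sketch; the unpair event and storage are placeholders.
import json
import time

SAVED_SPOT = "last_parking_spot.json"

def on_bluetooth_unpaired(latitude, longitude):
    """Called when the phone loses its pairing with the car:
    remember the current GPS fix as the likely parking spot."""
    with open(SAVED_SPOT, "w") as f:
        json.dump({"lat": latitude, "lon": longitude,
                   "timestamp": time.time()}, f)

def where_is_my_car():
    """Return the last stored parking position, or None if unknown."""
    try:
        with open(SAVED_SPOT) as f:
            return json.load(f)
    except FileNotFoundError:
        return None
```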

Curious to see what the future holds, and to actually see some of this work in action, I invited myself down to the “garage” at QNX headquarters. It’s at the far end of the building, next to the cafeteria. The hallway is festooned with posters of previous QNX concept vehicles, highlighting success stories in 3-foot-high glory.

The day I visited, there were half a dozen people in the garage, and two vehicles: a Jeep and a Highlander (otherwise known as the QNX reference vehicle and QNX technology concept vehicle). The garage is a combination of software development lab, hardware development lab, simulation environment, and actual garage (but without the greasy/oily smell). I wanted to get a sense of what drives these people, what they do, and how they do it.

Digital analogs
No, not that kind of digital display. Credit: Peter Halasz
The first thing I learned was that there are no real limits. They have the freedom to innovate, without preconceived notions about how things should look. For example, a lead designer on the team (let’s call him Allan, because that’s his name), explained how they look at the controls in the car’s dash display area. In the era of analog, the speedometer had a certain look — it was usually a needle rotating about a central point, where the needle pointed to the speed you were going. In the very early era of digitization, car manufacturers changed this needle to a seven-segment numerical display.

Of course, this was a failure, because the human brain is basically analog; it likes to see nice, continuous changes for processes that are continuous — such as the speed that you’re going. Seven-segment digits change too “randomly”; they require higher-level cognitive functions to parse what the individual lights mean and convert that into digits, and then convert that into a “speed” (and then convert that into “too slow,” or “just right,” or “too fast,” and then, finally, convert that into “apply brake” or “press down on throttle”).

Allan pointed out that changing to a digital display didn’t necessarily mean that they had to slavishly follow the analog “physical” appearance (except do it on an LCD display), but that they were free to experiment with “fill concepts” — digitally controlled analogs to the actual controls. We likened it to the displays in military avionics, where the most important information becomes bigger as it increases in importance. Consider a fighter jet at 20,000 feet — the altitude isn’t nearly as important as it is at 300 feet. Therefore, at 20,000 feet, the part showing the altitude is small, and in a less prominent position than it is when the plane is at 300 feet. The same thing applies to your speedometer: if you’re doing the speed limit, it’s not as important to show your current speed (you’re most likely flowing with traffic) as it is when you’re 20 over (or under).

In this image from the new QNX technology concept vehicle, the digital instrument cluster is warning that a forward collision is imminent, and that the driver is exceeding the speed limit by 12 mph.

You could do the same thing with your fuel range — when you have a full tank, the indicator can be off in a corner somewhere. But as you start to run low, the indicator can get bigger or more prominent, to start nagging you to refuel. By having the displays all be “virtual” on a large LCD screen in the dash, the designers have incredible flexibility to create systems that present relevant information when required, and have it move out of the way when something more important comes along. (Come to think of it, this would be an awesome feature to have on turn-signal indicators — after you’ve kept your blinker on for more than 10 seconds, it would start to get bigger and brighter. Maybe then people would stop driving with their turn indicator permanently on.)
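Here’s a little sketch of how a “fill concept” renderer might decide what gets bigger, with each gauge scoring its own importance. The thresholds and scale factors are invented for illustration — nothing here comes from the actual QNX cluster code:

```python
# Invented values for illustration; not from the QNX cluster.
def speed_importance(speed, limit):
    """Grows as the driver deviates from the posted limit
    (maxes out at 20 over or under)."""
    return min(abs(speed - limit) / 20.0, 1.0)

def fuel_importance(range_km):
    """Near zero on a full tank; rises as range drops below 100 km."""
    return max(0.0, min((100.0 - range_km) / 100.0, 1.0))

def gauge_scale(importance, base=0.5, max_scale=1.5):
    """Map an importance score in [0, 1] to an on-screen scale factor,
    so the most relevant gauge dominates the cluster."""
    return base + importance * (max_scale - base)

# 20 km/h over the limit: the speedometer grows to full prominence.
print(gauge_scale(speed_importance(speed=120, limit=100)))  # -> 1.5
```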

Collision avoided: The V2X command center
Also in the lab was a huge (3 by 5 foot) flat-panel touchscreen, mounted at an angle that’s aggressively unfriendly to coffee cups (probably for that very reason). It’s reminiscent of Star Trek’s main transporter control station, but it’s used to control and display the simulation environment’s V2V (vehicle to vehicle) and V2I (vehicle to infrastructure) data. It acts as a command center to control and reveal the innards of what’s going on in the simulation environment:



When I was there, we ran a vehicle collision avoidance scenario. Two vehicles (the Jeep and the Highlander, of course — they’re tied in to the system) were heading on a collision course (one was southbound and one was eastbound in a grid-style road system). Because they have V2V capabilities, both cars were aware of their impending doom. This showed up nicely on the V2V command center control panel — two cars heading towards each other, little red circles emanating from them indicating the realtime V2V “pings.” Of course, in plenty of time, the Jeep slowed down to avoid the collision (the actual brake lights even went on!). The speed, GPS coordinates, direction, and even what gear each vehicle was in were all shown on the master console. Towards the end of my visit I almost had Allan convinced to do another master control console for the OBDII connector so you could interact with all of the information in each car. What can I say? I like front panels. (I’m a reformed PDP-8 collector.)

The V2X command center, which makes its debut this week at CES, provides a bird’s eye view of several V2X traffic scenarios. In this example, V2X allows a vehicle (the Jeep) to detect that a vehicle up ahead (the Highlander) has braked suddenly, giving the Jeep plenty of time to slow down.
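Under the hood, the scenario boils down to geometry: each car broadcasts its position and velocity (the essence of a V2V basic safety message), and the receiver estimates the time to closest approach. Here’s a simplified sketch, with made-up coordinates standing in for the Jeep and the Highlander:

```python
# Simplified illustration; real V2V messages carry far more than this.
import numpy as np

def time_to_closest_approach(p_self, v_self, p_other, v_other):
    """Positions in metres, velocities in m/s, as 2-D vectors."""
    dp = p_other - p_self            # relative position
    dv = v_other - v_self            # relative velocity
    speed_sq = dv.dot(dv)
    if speed_sq < 1e-6:              # not closing on each other
        return float("inf")
    return max(-dp.dot(dv) / speed_sq, 0.0)

# Southbound Jeep and eastbound Highlander, both 60 m from the intersection:
t = time_to_closest_approach(np.array([0.0, 60.0]), np.array([0.0, -15.0]),
                             np.array([-60.0, 0.0]), np.array([15.0, 0.0]))
if t < 5.0:                          # illustrative warning threshold
    print(f"Brake: closest approach in {t:.1f} s")   # -> 4.0 s
```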

The engineers in the concept garage are “in the zone.” They’re working in an environment that encourages innovation. Watch and see what they produce:




About Rob
Rob is president of Iron Krten Consulting, which provides technical leadership services, from software leadership consulting through to security and embedded software products, development, training and contract services. Rob is also engaged by QNX Software Systems to write marketing and technical documentation. Visit Rob's website.

Why is software the key to bringing augmented reality to cars?

Guest post by Alex Leonov, marketing director, Luxoft Automotive.

While self-driving vehicles are gradually becoming a reality, more and more of today’s cars roll out from factories featuring advanced driver assistance systems (ADAS). We are quickly getting used to adaptive cruise control, blind spot monitoring, parking assistance, lane departure warning, and many other features that make driving safer and the driver’s job easier. Data from cameras, sensors, and V2X infrastructure feed into ADAS systems, increasing their accuracy and efficiency. These systems are important steps toward fully autonomous driving, but the ultimate responsibility for decision making still lies with a driver.

The more that cars become connected, the more the average driver can be bombarded by information while driving. “In 500 feet make a right turn.” “You have an incoming call from Christine.” “You have a new message on Facebook.” “You are over the speed limit.” This may not be much of a distraction under normal conditions. But sometimes, when driving in hectic city traffic or in a snow storm, it is critical to keep your eyes on the road while still receiving essential information. The good news is, the technology is already there to remedy this.

Heads up for HUDs
Keeping the driver’s eyes on the road is a priority, and head-up displays (HUDs) can accomplish just that. They project alerts and navigation prompts right on the windshield. Analysts predict explosive growth for HUDs, with the market reaching close to US$100 billion by 2020. The bulk of today’s HUDs are relatively simple combiners, but more advanced wide-field-of-view HUDs are coming soon.

Projecting alerts and navigation prompts directly on the windshield.
HUDs are perfect for presenting information in a convenient, natural way, and giving the driver a feeling of being in control. But HUDs are only as good as the information they display. That is why it is critical to have solid and reliable data processing and decision-making algorithms, running on a reliable OS, that can prioritize and filter data. The resulting alerts and prompts must be communicated to a driver in a clear, transparent way.

Computer vision, also known as machine vision, is a key to processing the endless flow of data. With its human-like image recognition ability, computer vision processes road scenes, and the system fuses data from multiple sources. Add in a natural representation of processing outcomes in the form of augmented reality, while tracking the driver’s pupils, and you have a completely new level of driver experience — safe and intuitive.

Next-generation driving experience
At Luxoft, we’ve been working on making this experience a reality. The result is CVNAR, a computer vision and augmented reality solution. CVNAR is a powerful software framework containing mathematical algorithms that process a vast amount of road data in real time to generate intuitive prompts and alerts. CVNAR has built-in algorithms for road and pedestrian detection, vehicle recognition and tracking, lane detection, facade recognition and texture extraction, road sign recognition, and parking space search. It performs relative and absolute positioning and easily integrates with navigation, the map database, sensors, and other data sources. A unique feature of CVNAR is its extrapolation engine for latency avoidance.

Detecting and recognizing road signs, pedestrians, traffic lanes, gas stations, and other objects.
CVNAR works well with LCD displays and smartglasses, but it is ultimately built for HUDs. Data from cameras, sensors, the CAN bus, and navigation maps are fused and processed to create an extensible metadata output that describes all augmented objects. Implementing CVNAR in a vehicle takes only a HUD and an eye-tracking camera: CVNAR tracks the driver’s gaze and adjusts the position of the augmented objects in the driver’s line of sight to make sure they don’t obstruct anything important, all in real time.
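To give a feel for what gaze-aware placement means, here is a deliberately simplified sketch. The placement rule below is made up for illustration and is not Luxoft’s algorithm; the point is only that a label’s position becomes a function of both the object’s projected location and the driver’s current gaze:

```cpp
struct Vec2 { float x, y; };

// Place an AR label near the object it annotates without covering it,
// biased toward the side the driver is already looking at, so both the
// object and its annotation are visible at a glance. A toy rule only.
Vec2 placeLabel(Vec2 objectOnWindshield, Vec2 gazePoint, float clearance) {
    Vec2 label = objectOnWindshield;
    label.y -= clearance;  // lift the label above the object
    const float side = (gazePoint.x >= objectOnWindshield.x) ? 1.0f : -1.0f;
    label.x += side * clearance * 0.5f;
    return label;
}
```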

Alerting the driver to an empty parking spot.
This is not all that CVNAR can do. New car models come packed with infotainment features that take time to learn and memorize, and the CVNAR-based smartphone app can help: it turns your smartphone into an interactive guide. Point your phone camera at your dashboard and use augmented prompts to find out more about a particular car function. It works under the hood, too.

Era of a software-defined car
A modern car runs on code as much as it runs on gasoline (or a battery-powered electric motor). Today, it takes over 100 million lines of software code to get a premium car going, and the amount of software necessary keeps expanding. At Luxoft, we are excited about the car’s digital future, and we work every day to help bring it about, by developing cutting-edge automotive solutions for leading global vehicle manufacturers.

Luxoft offers a wide range of embedded software development and integration services for in-vehicle infotainment and telematics systems, digital instrument clusters, and head-up displays, and has developed user experience (UX) and human machine interface (HMI) technology for millions of vehicles on the road today. We push the envelope in areas such as situation-aware HMI, computer vision, and augmented reality, while Luxoft’s products, the Populus and Teora UX and HMI design tool chains, power the development of award-winning automotive HMIs and slash time to market.

Software holds the key to the future of cars. It is essential to creating a customized user experience in the vehicle. Delivered through over-the-air updates, it offers unmatched flexibility and scalability. Finally, it takes safety to the next level through complex algorithms that simulate human-like logic.

You can view Luxoft’s CVNAR solution running on a QNX-based ADAS demo this week at CES, in the BlackBerry booth: LVCC North Hall, #325.



About Alex
Alex Leonov has been in the automotive and IT industry for over 18 years in various business development and marketing roles. Currently, Alex leads the global marketing efforts of Luxoft Automotive.

“I don’t know where I’m going from here, but I promise it won’t be boring”

Patryk Fournier
The quote is from the late, great David Bowie, and it is remarkably prophetic when applied to autonomous driving, which remains very much uncharted territory. Investments in roadway infrastructure are being made, consumer acceptance is trending positive, and, judging by the news and excitement from CES 2016, the future, if anything, will not be boring.

CES 2016 stretched into the weekend this year and, ICYMI, there was a lot of compelling media coverage of QNX and BlackBerry. Here’s a roundup of the most interesting coverage from the weekend:

Ars Technica: QNX demos new acoustic and ADAS technologies
The crew from Ars Technica filmed a terrific demonstration of the QNX Acoustics Management Platform and the QNX Platform for ADAS. The demonstration highlights the power and versatility of the acoustics platform, including the QNX In-Car Communication module, which allows the driver to speak effortlessly to passengers in the back of the vehicle over the roar of an engine revving at high speed. The demonstration also showcases how the QNX OS can support augmented reality and head-up displays:


Huffington Post: CES 2016 Proves The Future Of Driverless Cars Is Promising
Huffington Post highlighted BlackBerry and QNX as key newsmakers for advancements in driverless cars. The article notes QNX’s automotive leadership: “The software is actually installed in 50 per cent of the world’s automotive infotainment systems including Audi, Volkswagen, Ford, GM and Chrysler.”

CrackBerry: Inside the QNX Toyota Highlander at CES 2016
The folks at CrackBerry filmed a demonstration of our latest technology concept vehicle, based on a Toyota Highlander. The demo focuses on the QNX In-Car Communication acoustics module, which forms part of the recently launched QNX Acoustics Management Platform:



HERE 360: QNX and HERE bring to life a multi-screen experience in vehicles
A blog post from our ecosystem partner mentions HERE navigation and its use in the Toyota Highlander and Jeep Wrangler technology concept vehicles.

Award season

Patryk Fournier
The first two months of the calendar year are often referred to as award season by the entertainment industry. Although we don’t compete with the likes of Leonardo DiCaprio and Jennifer Lawrence, we still feel honored by the recent accolades and awards bestowed upon us.

In the category of Best Backhaul Software or Development Platform for Automakers, the winner is… QNX Software Systems.

Thank you so much to Auto Connected Car News and all the people and companies who voted for us in the Tech CARS Awards. We pride ourselves on offering flexible development platforms that enable automakers to deliver unique, branded experiences. Working with leading-edge automakers and Tier 1 suppliers drives us (pardon the pun) to continue upping our game in advanced platforms for infotainment, digital instrument clusters, advanced driver assistance systems (ADAS), and acoustics — including, of course, the recently announced QNX Platform for ADAS and QNX Acoustics Management Platform.

We would also like to congratulate our fellow award winners, Ford and Harman. Ford won Overall Best Car Infotainment Software by Automaker for its QNX-powered SYNC 3 connectivity system.

And speaking of Ford: the GSMA Global Mobile Awards recently announced their shortlist of finalists, and we just happen to be a finalist in the category of Best Mobile Innovation for Automotive for our work on Ford SYNC 3.

QNX-powered Ford SYNC 3: Shortlisted for a 2016 Glomo Award. Source: Ford
The Global Mobile Awards, newly rebranded as the Glomo Awards, will take place on February 23 at the Mobile World Congress event in Barcelona, Spain.
