ABI Research Blog

Smart and EMV Payment Card Market

Sep 13, 2011 12:00:00 AM / by Admin

Smart cards are flourishing within the payment card market and are set to make a major impact over the next five years as EMV adoption increases across the industry. Penetration of smart and EMV payment cards is increasing year on year, and this trend is set to continue for the foreseeable future.

So why are so many countries and regions moving their payment card credentials onto smart cards, or seeking to adopt and adhere to the EMV standard? There is no single definitive answer; a number of drivers are pushing migration:
Creating a widespread system that allows customers to access banking facilities anywhere in the world
Increased security with greater data protection and encryption on the card itself
Mag-stripe technology is outdated, and its static data is easier to capture and manipulate for fraudulent use (see the sketch after this list)
An increasingly mobile traveling population having problems using non-EMV cards in EMV-compliant countries, creating inconvenience for banking customers
External pressure as neighboring countries adopt EMV, which shifts liability for non-EMV fraudulent payments and displaces fraud from EMV-compliant countries to non-EMV ones
Added value products – multiple applications on one card
Contactless adoption
A step toward the smart city, combining multiple capabilities on one card and integrating with transit and other services
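The security gap behind these drivers can be illustrated with a toy comparison: mag-stripe track data is static, so anything a skimmer captures can be replayed, while an EMV chip computes a fresh cryptogram over each transaction. The snippet below is a minimal sketch of that idea only; real EMV cards derive 3DES/AES session keys and build an ARQC per the EMV specifications, not this simplified HMAC.

```python
import hashlib
import hmac
import os

# Static mag-stripe track data: whoever captures it can replay it verbatim.
track2 = "4111111111111111=25121010000012345678"

def toy_chip_cryptogram(card_key: bytes, amount_cents: int, unpredictable_number: bytes) -> str:
    """Toy stand-in for an EMV Application Cryptogram (ARQC): a MAC over the
    transaction amount plus a terminal-supplied random challenge, so a
    captured value is useless for any other transaction."""
    data = amount_cents.to_bytes(6, "big") + unpredictable_number
    return hmac.new(card_key, data, hashlib.sha256).hexdigest()[:16]

card_key = os.urandom(16)   # personalized into the chip at issuance
challenge = os.urandom(4)   # fresh "unpredictable number" from the terminal
print("mag-stripe (static):      ", track2)
print("chip cryptogram (dynamic):", toy_chip_cryptogram(card_key, 1999, challenge))
```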
At a top level, the argument for migration is overwhelming, although the level of investment required is a mitigating factor. Upgrading cards and infrastructure requires extremely high levels of capital, but most in the industry consider that cost to be justified by the fraud losses it prevents.
Although smart and EMV payment cards are well positioned, mag-stripe credentials still outnumber them worldwide. This is set to change over the next five years as smart and EMV payment cards claim a growing share of the payment card market. China, with an estimated 2.3 billion payment cards in circulation and an aim to complete migration to chip cards by 2015, will have a significant impact. Further analysis and ABI's views on the payment card market can be found in ABI Research's recently published insight, World Smart and EMV Payment Card Market 2010 – 2016, which provides commentary and findings from our 2011 Payment Card report.

M&A Continues in 2011 - Broadcom and NetLogic Microsystems

Sep 12, 2011 12:00:00 AM / by Admin

Broadcom announced today that it will acquire network IC supplier NetLogic Microsystems; the $3.5 billion deal is expected to close in 1H 2012.

This is the latest in a long line of acquisitions that Broadcom has made in the last two years in order to expand its product portfolio and increase its total addressable market (TAM). In 2010 alone it acquired 5 companies: Teknovus Inc. (EPON chipsets and software), Innovision Research & Technology PLC (NFC), Percello Ltd. (femtocell SoCs), Beceem Communications Inc. (4G platform solutions) and Gigle Networks Inc. (home networking SoCs).

This has added valuable products to all three of its main target segments: Home, Infrastructure and Hand.

So how many acquisitions can we expect this year? At the last count it stands at three, including NetLogic, Provigent Ltd. (ICs for microwave backhaul) and SC Square Ltd. (security software). Maybe it is time to strengthen its applications processor line?

More comment to follow…

Texas Utility Goes Cellular for Smart Grid Deployment

Sep 9, 2011 12:00:00 AM / by Admin

Traditionally, most utilities in the US have deployed their smart grid projects using either RF (radio frequency) mesh or PLC (power line carrier) connectivity in the NAN (neighborhood area network), connecting clusters of meters to a local data concentrator for backhaul to the utility's head-end. TNMP (Texas-New Mexico Power Company), a subsidiary of PNM Resources and the fourth-largest electricity distribution company in Texas, has chosen to deploy its smart grid using cellular network connectivity services from AT&T and a smart grid management platform from SmartSynch. ABI Research believes that when a complete picture of costs and benefits is formed, there is a strong argument for many utilities to use smart meters directly connected to the cellular network.

TNMP derives a number of benefits from its decision to utilize cellular connectivity for smart grid communications, which can be grouped broadly into two categories, financial and operational:
Financial: TNMP is capital-constrained, as are most utilities. TNMP found that the costs of hiring, training, and organizing the staff needed to plan, deploy, and manage a self-built smart grid communications network outweighed the initial equipment savings of a "typical" self-built RF mesh or PLC network relative to cellular radios. By utilizing AT&T's network, the utility can invest its capital in enhancing energy delivery rather than in deploying communication technology. The highly fragmented character of TNMP's service territory across Texas also added to the technical and organizational challenge of deploying an RF mesh or PLC based network.
Operational: TNMP benefits from having AT&T and SmartSynch focus on management of the communications network, leaving the utility free to focus on its core competency of energy transmission and delivery.
Cellular networks benefit from the massive scale and standards-based technology of the mobile services industry. AT&T is responsible for extending and upgrading its network, and the cost of doing so is amortized across millions of mobile phone users rather than being borne solely by utility operations. TNMP believes it has been negatively impacted by using proprietary protocols in the past, and specifically wanted a modularized, extensible solution that eliminates stranded assets. Over the lifetime of the deployment, the utility feels this will be much more efficient.
In addition, TNMP views cellular network connectivity as inherently more robust and more secure than a self-built network. If the network were to be disrupted by natural disaster, for example, AT&T will take responsibility for quickly restoring connectivity. AT&T has long experience and deep expertise in securing mobile communications and can apply this to TNMP’s smart grid communications. Furthermore, due to the P2P nature of cellular communications, if one smart meter is compromised, the other smart meters in the system remain isolated from the threat; there are no utility-specific data concentrators at risk of being compromised.
Finally, available bandwidth is higher with a P2P cellular connection than with a shared RF mesh network or PLC. While high bandwidth is not a critical need for meter reading operations, TNMP anticipates that bandwidth needs will increase as new applications are deployed on the smart grid. For example, the utility envisions public service announcements related to Texas's volatile weather conditions someday being transmitted over the smart grid network into the home.
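To make the point-to-point architecture concrete, here is a minimal sketch of the reporting path for a directly connected meter: each meter holds its own credentials and opens its own TLS session to the head-end, with no shared data concentrator in between. The endpoint, meter ID, and payload format are hypothetical illustrations, not TNMP's or SmartSynch's actual interface.

```python
import json
import socket
import ssl
from datetime import datetime, timezone

HEAD_END = ("ami.example-utility.com", 8883)   # hypothetical head-end endpoint

def report_reading(meter_id: str, kwh: float) -> None:
    """Push one interval reading directly to the head-end over the meter's
    cellular modem. The session is point-to-point, so compromising this
    meter exposes no neighboring meter and no shared concentrator."""
    payload = json.dumps({
        "meter": meter_id,
        "kwh": kwh,
        "ts": datetime.now(timezone.utc).isoformat(),
    }).encode()
    ctx = ssl.create_default_context()
    with socket.create_connection(HEAD_END) as raw:
        with ctx.wrap_socket(raw, server_hostname=HEAD_END[0]) as conn:
            conn.sendall(payload)

# report_reading("meter-000123", 412.7)   # hypothetical meter ID
```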

Horizontal Standards for M2M

Sep 9, 2011 12:00:00 AM / by Admin

Efforts to develop broad, horizontal standards for the M2M market are gaining momentum. The most important activity is occurring within the context of the International Telecommunication Union’s (ITU) Global Standards Collaboration (GSC), which has established the “M2M Standardization Task Force” (MSTF) to coordinate the efforts of individual standards development organizations (SDOs). While most M2M applications are developed today in a highly customized fashion, and vertical-specific industry bodies are busy crafting standards for markets ranging from the smart grid to the auto industry, it will be broad horizontal standards that will be the major impetus to growth in the middle and later years of this decade.

The end result of these efforts is to define a conceptual framework for M2M applications that is vertical industry- and communication technology-agnostic, and to specify a service layer that will enable application developers to create applications that operate transparently across different vertical domains and communication technologies without the developers having to write their own complex custom service layer. This is a key requirement for the M2M industry to move from its current state of applications existing in isolated silos based on vertical market or underlying technology to a truly interconnected “Internet of Things”.
SDOs want to incorporate existing standards into this conceptual framework as much as possible. Rather than re-invent what already exists, the SDOs prefer to identify and fill gaps, and to integrate what already exists into a unified conceptual framework, as described above. This approach recognizes that it is impossible, or at least undesirable, to try to define physical layer technologies, or networking layer protocols, for every current or future potential M2M application. Different vertical applications will optimize for individual cost and functionality requirements, while a standardized service layer will facilitate cross-vertical application development.
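A rough way to picture the horizontal service layer is an interface that application code targets once, with per-technology adapters underneath, so the application is agnostic to both the vertical domain and the transport. The class and method names below are invented for illustration and are not drawn from any published SDO specification.

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Communication-technology adapter: cellular, ZigBee, PLC, and so on."""
    @abstractmethod
    def send(self, device_id: str, payload: bytes) -> None: ...

class CellularTransport(Transport):
    def send(self, device_id: str, payload: bytes) -> None:
        print(f"[cellular] {device_id} <- {len(payload)} bytes")

class ZigBeeTransport(Transport):
    def send(self, device_id: str, payload: bytes) -> None:
        print(f"[zigbee] {device_id} <- {len(payload)} bytes")

class M2MServiceLayer:
    """Horizontal layer handling registration, addressing, and delivery, so an
    application never needs to know which transport a device sits on."""
    def __init__(self) -> None:
        self._registry: dict[str, Transport] = {}

    def register(self, device_id: str, transport: Transport) -> None:
        self._registry[device_id] = transport

    def publish(self, device_id: str, payload: bytes) -> None:
        self._registry[device_id].send(device_id, payload)

# The same application code addresses a smart meter and a vehicle tracker alike.
layer = M2MServiceLayer()
layer.register("meter-17", ZigBeeTransport())
layer.register("truck-42", CellularTransport())
layer.publish("meter-17", b'{"kwh": 3.2}')
layer.publish("truck-42", b'{"lat": 32.77, "lon": -96.79}')
```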
ABI Research believes that an initial proposal for such a framework and service layer could be available by early 2012. It would likely take another 18 – 24 months for this initial proposal to be formally published as a standard or set of standards. Therefore, ABI Research doesn’t expect these efforts to start having an impact until the late 2013 to early 2014 timeframe. We do expect that once such standards are in place, they will play an important role in driving overall M2M market development.
The key benefits of horizontal standards will be faster and less costly application development, and more highly functional and secure applications. Similarly to the market benefit of third-party apps running on smart phone platforms, M2M apps developed on horizontal platforms will be able to make easier use of underlying technologies and services. App developers will not have to pull together the entire value chain or have expertise in esoteric skill sets, such as RF engineering. This will dramatically increase the rate of innovation in the industry in addition to creating more cross-linkages between various M2M applications. Think about future plug-in hybrid electric vehicles communicating with the drivers’ homes, the road, various infotainment services, the smart grid, etc. and you start to get a taste of what is coming.

The Other Google/Motorola Story

Sep 9, 2011 12:00:00 AM / by Admin

Apart from a good article on CNN.com, there has not been much discussion of the ramifications of Google's acquisition of Motorola Mobility for the home automation market. Google had earlier announced its "Android@Home" initiative on May 9, 2011. Likewise, Motorola Mobility had acquired home automation platform specialist 4Home in December 2010. ABI Research believes the acquisition provides a unique opportunity for Google to differentiate itself in the home automation market.
Google now finds itself with two distinct yet overlapping home automation platforms. Given that Verizon and 25 other service providers around the world have either announced managed home automation solutions based on the 4Home platform or are developing offerings based on it, it would be truly unfortunate for Google to shut down the 4Home effort. There is perhaps more reason to shut down the more recent, and unproven, Android@Home effort, especially given Google's previous struggles in the home systems market (remember PowerMeter?) and its apparent strategy of creating a whole new short-range wireless standard to go along with the middleware. (Given the existence of ZigBee, Z-Wave, HomePlug, low-power Wi-Fi, and others, the intention to launch yet another home automation physical layer standard is almost nonsensical, and points to a fundamental lack of understanding of the home automation market.)
Rather than getting rid of one or the other, Google should differentiate itself by offering a tiered approach to home automation solutions. In short, provide Android@Home as a free platform for developers to create basic home monitoring and control apps that use the smartphone as the control hub and host device. This would be particularly targeted toward younger customers, perhaps living in apartments, who could be introduced to home automation functionality in a very simple, low-cost manner. This approach could even accommodate couples; home automation system vendor Lagotek has demonstrated the feasibility of having control software reside in more than one control point in the home, and there is little reason why both members of a couple couldn't share control of their home's automation functionality between their separate smartphones.
Building on this entry-level tier, Google should continue to support the 4Home platform as a key enabler of service provider managed home automation offerings. Customers introduced to home automation as younger singletons are likely to be more apt to adopt a full home automation system as they mature and their families grow than customers a service provider has to "educate" from scratch about the need for, and benefits of, home automation. We could even see service providers explicitly offer these tiers to their customers: start with a smartphone app, graduate to a whole-home automation system.
Google has a unique opportunity to truly differentiate itself in the home automation market, both through a tiered approach and by leveraging the Android ecosystem to provide mass market education about home automation and funnel potential customers to 4Home-based systems. Will Google be able to focus on this opportunity while it manages its competitive strategy in the handset and tablet industries? We'll see…
For more information on the home automation industry, please see ABI Research’s Home Automation Systems research service.

Qualcomm gains more video technologies

Sep 7, 2011 12:00:00 AM / by Admin

Today, Qualcomm and IDT announced that Qualcomm had acquired two video-related technologies (and their design teams) from IDT: the company's Hollywood Quality Video (HQV) and Frame Rate Conversion (FRC) groups. These technologies are used to drive video displays of all types, as well as to convert relatively low-quality (or low-bitrate) video, including SD video, to high definition. We have commented before (including in our Set-Top Box SoC report) that Qualcomm's acquisition of Atheros points it toward the connected home. This move makes sense for Qualcomm in many ways, including:

Mobile technologies of today (i.e., apps and Android) point the direction for home video technologies of tomorrow
Atheros HomePlug technologies and Hy-Fi (HomePlug, WiFi and Ethernet) naturally connect video in the home
In developing regions, where wireless infrastructure (including fixed wireless) has surpassed fiber and wireline infrastructure, many operators will deliver video experiences over wireless platforms. Up-converting technologies, like those from IDT, will be required to give customers engaging, smooth video quality (even if not true HD) over wireless networks.
Therefore, this acquisition gives Qualcomm yet another tool to integrate into its SoCs. These technologies make sense in high-end feature phones (notably those with HDMI outputs), but they also help amass the technologies required to compete in digital TV and set-top box chips. The only missing link is MoCA; an acquisition of Entropic would be the final signal that Qualcomm is dedicated to the video-in-the-home market.
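As a rough illustration of what frame rate conversion does at its simplest, the sketch below doubles a clip's frame rate by inserting a blended frame between each original pair. This naive averaging is illustrative only; shipping FRC blocks such as IDT's rely on motion-compensated interpolation rather than simple blending.

```python
import numpy as np

def double_frame_rate(frames: np.ndarray) -> np.ndarray:
    """Naive FRC: insert an averaged frame between each adjacent pair.
    `frames` has shape (n, height, width, channels) with dtype uint8."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        blended = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
        out.append(blended)
    out.append(frames[-1])
    return np.stack(out)

clip = np.random.randint(0, 256, size=(24, 360, 640, 3), dtype=np.uint8)  # 1 s at 24 fps
print(double_frame_rate(clip).shape)   # (47, 360, 640, 3), roughly 48 fps
```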

“So It Begins” – Netflix’s Journey Towards the Pay-TV Precipice

Sep 2, 2011 12:00:00 AM / by Admin

Despite Netflix's recent decision to split the hybrid $9.99 plan into two discrete tiers (streaming only and DVD only, each priced at $7.99 per month), which was intended to generate additional revenue as a means to secure additional streaming content, it looks as if the service might lose Starz content come February 28, 2012, when the current agreement expires. While the cessation of Starz content on the streaming service is not an absolute certainty (a different arrangement might be reached between now and February 2012), Starz recently ended licensing negotiations with Netflix, having presumably reached an impasse over the value of its content. Netflix's stock naturally declined following the announcement, but is this just a quick reaction to bad news, or does it portend a more arduous future than the gilded road the company seemed to be walking?

Netflix is encountering many of the same roadblocks faced by streaming services before it, and while the company's large installed base insulated it for a time, the realities of the video market are starting to come to light. Hulu stands as a prime example of how difficult negotiations between a streaming company and content owners can be, given the trouble Hulu has had negotiating for content with its own stakeholders. Others, like Google TV, arrived as lame ducks (in terms of content) after content holders largely refused to support the platform. It is clear that content owners are working diligently to maintain the premium image of their product, and this commitment will ultimately force companies like Netflix to make a tough decision: either remain complementary to the pay-TV operators or take the plunge and become a pay-TV operator itself.

While this might not reach the level of a "Virtual Multichannel Video Programming Distributor" (VMVPD), it will likely mean pricing similar (or equivalent) to a traditional pay-TV operator. In other words, if consumers want the content they will pay for it one way or another, be it through a traditional MVPD or a service like Netflix. While Netflix continues to cite its customer base's penchant for "older" content (in terms of streams served), it is foolish to assume this is a viable long-term strategy.

Netflix has rather brilliantly spun changes to its service as fitting the behavior of its customer base. The 28-day window delay for new content was brushed off by citing customers' proclivity for older content: a large portion of the content sent out to customers was older titles. But the reality is that many customers who selected newer releases had to wait for that content to arrive and in the meantime received older movies from their queue instead. In addition, new movies (that customers want to see) are not released every month, let alone weekly, so to get value out of their monthly subscription fee many consumers also added older movies to their queue. To say customers prefer older movies to newer ones is not entirely accurate.

Now, with Starz, Netflix was quick to point out that Starz accounts for only 8% of viewing, down from a peak of 20%, but again this fails to adequately address the real problem. The peak in Starz viewing likely occurred not long after the initial deal was put in place, and discounting 8% of viewing is foolhardy: Netflix does not add new content at a robust rate, and the fact that subscribers were still watching Starz content at nearly 10% of total volume suggests consumers still found it valuable enough to revisit previously viewed shows and movies from the Starz library. Granted, Netflix will likely take the funds earmarked for Starz and put them elsewhere, but securing another pool of Japanese anime will only cater to a niche segment (yes, I'm being facetious to some degree).

In the end, Netflix will likely have to raise its monthly subscription rate in order to secure better content, and while this could come at the cost of some subscribers, it will nonetheless be a necessary move if the company wishes to remain a significant player in the streaming video market. Netflix will have to lose the illusion that the size of its customer base (which could very well start declining) will force the video industry to move at its command, or that customers will always be there; otherwise the company runs the risk of becoming the next Blockbuster Video.

Video enters the Cloud at IBC 2011

Sep 1, 2011 12:00:00 AM / by Admin

IBC 2011 will be a conference about the Cloud. I’ve heard from multiple video vendors that their plans and offerings for the show will focus on cloud delivery of services.

The multiscreen / TV Everywhere world has created a number of new challenges in video processing. For instance, operators need the ability to rapidly process large content libraries (VOD libraries now run to 40,000 assets and more), creating new formats for new device types (e.g., the newly ubiquitous HP TouchPad ;), new codecs (e.g., WebM), or new delivery standards (e.g., MPEG DASH). Smaller operators can't afford to deploy the vertically oriented, high-integration-cost solutions that large operators can, yet they need to offer the same types of services to customers.
Vendors' solutions to these problems will include offering cloud services in addition to traditional in-house or managed offerings. In addition, announcements will be made about applying service-oriented architectures (SOAs) to video, including the unveiling of technical specifications for the Advanced Media Workflow Association's Framework for Interoperable Media Services (FIMS 1.0). A good explanation of the history and role of FIMS is available from Broadcast Engineering.
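As a sketch of the kind of reformatting job such a cloud service would run across a whole library, the snippet below repackages a single asset into a two-rendition MPEG-DASH presentation with ffmpeg. The bitrate ladder, file names, and flags are illustrative assumptions (and presume an ffmpeg build with the dash muxer), not any particular vendor's workflow.

```python
import subprocess

def package_dash(src: str, out_mpd: str) -> None:
    """Transcode one mezzanine file into a two-rendition MPEG-DASH presentation.
    A cloud workflow would fan this job out across every asset in the VOD library."""
    cmd = [
        "ffmpeg", "-i", src,
        "-map", "0:v", "-map", "0:v", "-map", "0:a",   # two video renditions, one audio
        "-c:v", "libx264",
        "-b:v:0", "3000k", "-s:v:0", "1280x720",
        "-b:v:1", "800k",  "-s:v:1", "640x360",
        "-c:a", "aac", "-b:a", "128k",
        "-f", "dash", out_mpd,
    ]
    subprocess.run(cmd, check=True)

# package_dash("asset_0001.mov", "asset_0001.mpd")   # hypothetical asset name
```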

Will pricing in the stratosphere for HTC's Jetstream tablet mean an early fizzle?

Sep 1, 2011 12:00:00 AM / by Admin

AT&T announced it will begin offering an LTE-enabled HTC media tablet this weekend to its US audience. The device, dubbed "HTC Jetstream", appears competitive in terms of its specifications - a dual-core processor, Android Honeycomb OS, and 10.1" display. Should everyone go out and buy one when it hits AT&T retail shelves this weekend? No, there are a few caveats to consider.

If you're considering this tablet because of its potential for high-speed LTE wireless networking on-the-go, remember that AT&T has yet to launch an LTE service in the US. Perhaps it will start soon, but will it be available where you live and intend to use the device? The media tablet also includes HSPA+ support (what AT&T and T-Mobile USA each call their "4G" networks) in areas where LTE isn't offered, so all is not lost if you choose to be the first to try the Jetstream.

Pricing for the Jetstream is clearly a move to recoup AT&T's investment in its LTE mobile broadband network, though real-world performance messaging and a "why HTC" story are sorely missing from the operator's promotion. The device, when purchased with a 2-year service contract, comes in at $700. A comparable iPad 2 with 32 GB of storage and an HSPA (AT&T) or EV-DO Rev. A (Verizon Wireless) modem is $729 without any operator subsidy.
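Putting the numbers side by side makes the point: the contract price barely undercuts the unsubsidized iPad 2 before any data plan is counted. The monthly plan price below is a hypothetical placeholder, not AT&T's actual tariff.

```python
MONTHS = 24
jetstream_on_contract = 700    # device price with a 2-year service agreement
ipad2_3g_32gb = 729            # unsubsidized, no contract required
monthly_data_plan = 35         # hypothetical plan price assumed for illustration

jetstream_total = jetstream_on_contract + MONTHS * monthly_data_plan
print(f"Jetstream over {MONTHS} months: ${jetstream_total}")   # $1540 with the assumed plan
print(f"iPad 2 hardware alone:       ${ipad2_3g_32gb}")
```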

What is the right price for a media tablet? The average selling price in 2010 was under $500 in the US for the hardware (not discounting for any subsidies or incentives). We also know that media tablets fly off shelves at $99, though HP in that example was liquidating its webOS-based TouchPad. The sweet spot for media tablet pricing lies somewhere in between.

It sounds like a good device, and jumping straight to an LTE modem will offer some future-proofing for mobile broadband performance. But the device price and the lack of an LTE network to connect to now suggest putting this media tablet on a wish list for the holidays rather than on a credit card today.

Lessons to be Learnt from Facebook for Mobile App Developers

Aug 24, 2011 12:00:00 AM / by Admin

Recently, Facebook announced that it has more than 750 million users worldwide. With such a staggering number of users, it’s hard to imagine that Facebook had only a little over 100 million users in 2008. Therefore, a study of the machinery that propelled Facebook to become the world’s no.1 social networking site can give good insights for mobile app developers.

Network effect
The network effect exists only when the value of a network increases as the number of people using it increases. For the network effect to take hold, getting big quickly is important. Facebook got big fast by making the platform free to use and by constantly encouraging users to invite their friends through game invites. More users increased activity exponentially and kept existing users excited and engaged. Imagine Facebook with only a handful of users: it would not be as much fun, with fewer videos and photos and fewer people commenting on posts. Another point: having a network effect in place increases switching costs and prevents competition from getting established in the first place.
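One common way to formalize the network effect (an assumption for illustration, not a figure Facebook publishes) is Metcalfe's law, which values a network roughly in proportion to its number of possible connections, n(n-1)/2. The toy calculation below shows how that value grows far faster than the user count itself.

```python
def metcalfe_value(users: int) -> int:
    """Potential pairwise connections: a rough proxy for network value."""
    return users * (users - 1) // 2

for users in (1_000, 1_000_000, 100_000_000, 750_000_000):
    print(f"{users:>13,} users -> {metcalfe_value(users):,} possible connections")
```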
Positive feedback
Positive feedback, or positive user experience, is a must to "win love" from users. This criterion focuses app developers right back on fundamentals: at its base, an app must operate smoothly; then it must look and feel good; and finally, it must be user friendly.
Customer lock-in
In other words, the app must have high switching costs. Facebook locks in its users through the network effect, as well as by playing on each user's fear of losing contact with friends and of losing the accumulated memories (funny or heartwarming posts, comments, and photos stored in Facebook, for example).
Increasing returns for the users
Users should feel that the app becomes more and more valuable over time; essentially, they should keep discovering new uses for it in their daily lives. Initially, Facebook was a platform where users could interact and have fun, but it has since evolved into a personal assistant where a user can organize events and be reminded of events to attend.
