<img height="1" width="1" style="display:none;" alt="" src="https://px.ads.linkedin.com/collect/?pid=1448210&amp;fmt=gif">
Free Research
ABI Research Blog | Admin (75)

Admin


Recent Posts

Smart Grid Communications in the UK

Sep 20, 2011 / by Admin

The UK government has committed to a cellular-based solution for its smart grid, announcing a tender for £4.59 billion (US$7.5 billion) worth of contracts to build a new wide area network (WAN) for the grid. This commitment will help the UK meet European Union mandates to support national smart metering by 2020.

It is a bold step for the UK to choose a wireless network rather than the power line communication (PLC) standards used for smart grids in most other European countries. The UK government is expected to use General Packet Radio Service (GPRS) or Global System for Mobile Communications (GSM) technology rather than 3G or LTE, because smart meters do not require much bandwidth, and 2G is a cost-effective and well-understood technology.
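To see why 2G bandwidth suffices, consider a rough back-of-envelope calculation. This is a minimal sketch; the payload size, reading frequency, and throughput figures are illustrative assumptions, not figures from the tender:

```python
# Back-of-envelope check that GPRS capacity comfortably covers smart
# meter traffic. All figures are illustrative assumptions, not from
# the UK tender.

READING_SIZE_BYTES = 200   # assumed payload per reading, incl. protocol overhead
READINGS_PER_DAY = 48      # assumed half-hourly reads

daily_bytes = READING_SIZE_BYTES * READINGS_PER_DAY
monthly_kb = daily_bytes * 30 / 1024

# GPRS typically delivers tens of kbps; assume a conservative 20 kbps.
GPRS_KBPS = 20
seconds_to_send_daily = daily_bytes * 8 / (GPRS_KBPS * 1000)

print(f"Per meter: {daily_bytes} bytes/day, ~{monthly_kb:.0f} KB/month")
print(f"Airtime at {GPRS_KBPS} kbps: ~{seconds_to_send_daily:.1f} s/day")
```

Even with generous overhead assumptions, a meter needs only a few seconds of GPRS airtime per day, which is why 2G remains attractive for metering despite its age.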

One of the conditions for telecom companies participating in the tender is that they must provide 100% coverage of the UK. The tender is split into three regions/contracts: northern Great Britain, which includes Scotland; central Great Britain, which includes Wales; and southern England. Each of the three contracts will in turn be split into services: telephone and data transmission services, IT services, consulting, software development, internet, and support.

There are two very interesting aspects to the UK government’s new communication initiatives for smart grid:

The government has set up a central data and communications company (DCC) to control access to the data from the deployed smart meters. This takes the security responsibility away from energy utilities and should help instill greater consumer confidence in data privacy; a hypothetical sketch of this brokered-access pattern appears after the next point. The US and other European countries have been debating possible solutions to smart grid security and will take particular interest in the UK's model.

The choice of GPRS or GSM, which are 2G technologies. 2G may be well understood and cost-effective, but how long will GSM and GPRS networks continue to run? Mobile operators are already planning to reuse 2G spectrum for 4G, which will seriously reduce 2G capacity. Additionally, 2G's limited throughput means the DCC may struggle to interact with consumers and provide real-time, accurate information.
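Returning to the DCC point above, here is a purely hypothetical sketch of centralized, brokered access to meter data. The class and method names are invented for illustration; the real DCC's interfaces are not described here.

```python
# Hypothetical sketch of the DCC pattern: a single broker mediates
# all access to meter data and enforces authorization, so individual
# utilities carry no direct security responsibility. Interfaces are
# invented for illustration; they are not the real DCC's.

class DataCommsCompany:
    def __init__(self):
        self._readings = {}   # meter_id -> latest reading (kWh)
        self._grants = set()  # (utility_id, meter_id) pairs allowed access
        self._audit_log = []

    def ingest(self, meter_id: str, kwh: float):
        """Meters report only to the DCC, never to utilities directly."""
        self._readings[meter_id] = kwh

    def grant(self, utility_id: str, meter_id: str):
        self._grants.add((utility_id, meter_id))

    def read(self, utility_id: str, meter_id: str) -> float:
        """Every access is authorized and audited in one central place."""
        if (utility_id, meter_id) not in self._grants:
            self._audit_log.append(("DENIED", utility_id, meter_id))
            raise PermissionError(f"{utility_id} may not read {meter_id}")
        self._audit_log.append(("OK", utility_id, meter_id))
        return self._readings[meter_id]

dcc = DataCommsCompany()
dcc.ingest("meter-GB-001", 12.7)
dcc.grant("utility-A", "meter-GB-001")
print(dcc.read("utility-A", "meter-GB-001"))  # 12.7
# dcc.read("utility-B", "meter-GB-001")       # would raise PermissionError
```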

Nevertheless, the UK government's investment in smart grid communications is good news for the energy community and will no doubt be a positive for telecom companies eager to get into the smart grid market.


Netflix Shuns the Bundle as Operators Embrace It

Sep 19, 2011 / by Admin

Today, Netflix CEO Reed Hastings sent an email to subscribers, posted to the company blog, and released a YouTube video announcing that, in addition to the price increase, Netflix would be separating its DVD operation into a separate business, Qwikster, led by Andy Rendich (former head of DVD operations). He highlighted, again, the different business needs of the two. Hastings said, “Over the long term, DVD and streaming are gonna get more and more different. Streaming has incredible television shows, streaming is instant, streaming is fairly global, streaming has many things that make it different from DVD. And that over time, both DVD and streaming will be much better because they are separate…”

As a business professional, I understand their quandary… it is a lot simpler to manage profit & loss statements on a service-by-service basis. Furthermore, siloed operations – such as a cable company that uses a separate technology platform for its multiscreen solution than for its traditional VOD services – can be faster to get up and running (no disrespect meant to the many great cloud video platforms being launched). However, the long-term costs – duplicate metadata creation, managing subscriber plans in multiple places, and trying to provide a seamless user experience across multiple systems – erode the perceived benefits over time.
Most importantly, however, Netflix has failed to see things from the customer's perspective. Customers want integration of their entertainment experiences and want to be able to select content regardless of the source. Today, customers must already navigate a fragmented world: live TV; YouTube for short content and some web TV shows; Hulu or broadcaster portals (e.g., fox.com) for recent TV shows. Long-tail content is split between Netflix, operator VOD, DVD sales, and reruns. The Netflix and Qwikster sites won't be integrated, which will fragment customers' queues and decrease the value of ratings (something Netflix got right from the beginning). This change goes against the grain of the value of the bundle, at a time when cable providers and telcos are shifting their attention from triple-play bundles (phone, TV, and internet) to quad-play bundles (adding wireless services). The ‘carrot’ announced to Qwikster customers is the addition of game rentals (for Xbox 360, Wii, and PlayStation 3) as an add-on to the service.
Judging from comments on the blog and on YouTube, Reed Hastings' statement at the start of these announcements – “I messed up. I owe everyone an explanation” – is just the beginning of the explaining Netflix owes its US fan base, and possibly its investors. Netflix is a great company. Its move into the fast-growing TV markets of Latin America offers great potential (as DirecTV has shown), although low broadband penetration will keep growth slow. In my mind, the DVD service helps increase the value of the streaming service, due to both technological (i.e., low broadband penetration) and content (release window) limitations. Netflix's failure to recognize the power of the bundle in customers' minds will continue to be a source of friction.


The US and EMV

Sep 19, 2011 / by Admin

EMV migration in the US has been a talking point for some time. EMV card deployment remains limited in the US, with banks and financial institutions issuing EMV cards only to their tier 1 customers or those who travel abroad frequently.

In a new twist to the US EMV saga, Visa announced in August this year that it was going to support EMV migration in the US.
This announcement had a lot of people excited and was labeled by some as the starting point for a full nationwide EMV migration program. It is not; it is a support program by Visa to encourage merchants to adopt dual contact and contactless readers so that EMV payments can be accepted whenever an EMV payment card is presented.
The program could certainly prove to be the starting point, an additional driver, or a catalyst for EMV migration, but my feeling is that a country which has so far avoided migration will continue to do so for some years to come.
Visa's announcement has also almost certainly been influenced by the impending US NFC (Near Field Communication) market. With NFC starting to come to market and expectations high, it makes perfect sense to deploy as much EMV infrastructure as possible to support NFC contactless payments.
This is a good strategy by Visa: by offering support, it is simply asking the merchants to come to it. Even a partial deployment of EMV terminals in the US could greatly increase Visa's position in the EMV market. Visa is looking to take as much US market share as possible early on and to adopt a market position that will benefit its future.
For ABI Research's view on how this announcement will affect the US payment cards market, please see our Insights – Visa To Support EMV In The US and How Long Can America Avoid EMV?


Smart and EMV Payment Card Market

Sep 13, 2011 / by Admin

Smart cards are flourishing within the payment cards market and are set to make a major impact over the next five years as the industry increasingly adopts EMV. Penetration of smart and EMV payment cards is increasing year on year, and this trend is set to continue for the foreseeable future.

So why are so many countries and regions moving their payment card credentials onto smart cards, or seeking to join and adhere to the EMV standard? There is no single definitive answer; a number of factors are driving migration:
Creating a widespread system that gives customers access to banking facilities anywhere in the world
Increased security, with greater data protection and cryptography on the card itself (a toy sketch of this point follows the list)
Mag-stripe technology is outdated and easier to capture/manipulate for fraudulent use
An increasingly well-traveled population has problems using non-EMV cards in EMV-compliant countries, creating inconvenience for banking customers
External pressure from neighboring countries adopting EMV, which results in the transfer of liability for non-EMV fraudulent payments and a shift of fraud from EMV-compliant countries to non-EMV ones
Added-value products – multiple applications on one card
Contactless adoption
Another step toward the smart city, combining multiple capabilities on one card through integration with travel and other services
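To make the security point above concrete, here is a deliberately simplified sketch of the idea behind a chip card's application cryptogram: the chip computes a MAC over the live transaction data with a key that never leaves the card, so copying everything readable off the card (as with mag-stripe skimming) is not enough to forge new transactions. This is a toy using HMAC-SHA256 as a stand-in; real EMV uses its own key derivation and MAC algorithms, and this is not the EMV specification.

```python
# Toy illustration of why chip cards resist the cloning attacks that
# work against mag-stripe. NOT the real EMV algorithm: EMV derives
# session keys and computes its MAC per its own specification;
# HMAC-SHA256 is used here purely as a stand-in.
import hashlib
import hmac
import os

CARD_SECRET_KEY = os.urandom(16)  # lives only inside the chip

def application_cryptogram(amount_cents: int, currency: str,
                           terminal_nonce: bytes, counter: int) -> bytes:
    """MAC over the transaction data, keyed by the card's secret."""
    data = f"{amount_cents}|{currency}|{counter}|".encode() + terminal_nonce
    return hmac.new(CARD_SECRET_KEY, data, hashlib.sha256).digest()[:8]

# The terminal sends an unpredictable number; the card returns a
# cryptogram the issuer can verify with its copy of the key.
nonce = os.urandom(4)
ac = application_cryptogram(2500, "USD", nonce, counter=42)
print("cryptogram:", ac.hex())

# A fraudster who copies everything readable off the card still cannot
# produce a valid cryptogram for a new nonce/counter, unlike replaying
# static mag-stripe track data.
```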
At a top level, the argument for migration is overwhelming, although the level of investment required is a significant inhibiting factor. Upgrading cards and infrastructure requires extremely high levels of capital, but most consider that cost to be outweighed by the losses accrued through fraudulent activity.
Although smart and EMV payment cards are well positioned, there remains a higher volume of mag-stripe credentials worldwide. This is set to change over the next five years as smart and EMV payment cards claim further share of the payment card market. China, with an estimated 2.3 billion payment cards in circulation and an aim to complete migration to chip cards by 2015, will have a significant impact. Further analysis and ABI's views on the payment card market can be found in ABI Research's recently published insight – World Smart and EMV Payment Card Market 2010 – 2016 – which provides commentary and findings from our 2011 Payment Card report.


M&A Continues in 2011 - Broadcom and NetLogic Microsystems

Sep 12, 2011 / by Admin

Broadcom announced today that it will acquire network IC supplier NetLogic Microsystems; the $3.5 billion deal is expected to close in 1H 2012.

This is the latest in a long line of acquisitions Broadcom has made over the last two years to expand its product portfolio and increase its total addressable market (TAM). In 2010 alone it acquired five companies: Teknovus Inc. (EPON chipsets and software), Innovision Research & Technology PLC (NFC), Percello Ltd. (femtocell SoCs), Beceem Communications Inc. (4G platform solutions), and Gigle Networks Inc. (home networking SoCs).

This has added valuable products to all three of its main target segments: Home, Infrastructure and Hand.

So how many acquisitions can we expect this year? At the last count, three, including NetLogic, Provigent Ltd. (ICs for microwave backhaul), and SC Square Ltd. (security software). Maybe it is time to strengthen its applications processor line?

More comment to follow…


Texas Utility Goes Cellular for Smart Grid Deployment

Sep 9, 2011 / by Admin

Traditionally, most utilities in the US have deployed their smart grid projects using either RF (radio frequency) mesh or PLC (power line carrier) connectivity in the NAN (neighborhood area network), connecting clusters of meters to a local data concentrator for backhaul to the utility's head-end. TNMP (Texas-New Mexico Power Company), a subsidiary of PNM Resources and the fourth-largest electricity distribution company in Texas, has instead chosen to deploy its smart grid using cellular network connectivity services from AT&T and a smart grid management platform from SmartSynch. ABI Research believes that when a complete picture of costs and benefits is formed, there is a strong argument for many utilities to use smart meters directly connected to the cellular network.

TNMP derives a number of benefits from its decision to use cellular connectivity for smart grid communications; these can be grouped broadly into two categories, financial and operational (a rough cost model is sketched after this list):
Financial: TNMP is capital-constrained, as are most utilities. TNMP found that the costs of hiring, training, and organizing the staff needed to plan, deploy, and manage a self-built smart grid communications network outweighed the lower initial equipment costs of a “typical” self-built RF mesh or PLC network compared with cellular radios. By using AT&T's network, the utility can invest its capital in enhancing energy delivery rather than in deploying communication technology. The highly fragmented character of TNMP's service territory across Texas added to the technical and organizational challenge of deploying an RF mesh or PLC based network as well.
Operational: TNMP benefits from having AT&T and SmartSynch focus on management of the communications network, leaving the utility free to focus on its core competency of energy transmission and delivery.
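The financial argument can be made concrete with a rough total-cost-of-ownership comparison. This is a minimal sketch of the structure of the trade-off; every figure below (meter count, hardware prices, staffing, data plan rates) is an illustrative assumption, not TNMP's or AT&T's actual numbers.

```python
# Rough total-cost-of-ownership comparison between a self-built RF
# mesh NAN and direct cellular connectivity. Every figure here is an
# illustrative assumption, not data from TNMP or AT&T.

METERS = 250_000
YEARS = 10

def mesh_tco(meters: int, years: int) -> float:
    radio = 35.0 * meters                    # assumed mesh radio per meter
    concentrators = (meters / 500) * 4_000   # assumed one concentrator per 500 meters
    staff = 1.5e6 * years                    # assumed network planning/ops team
    return radio + concentrators + staff

def cellular_tco(meters: int, years: int) -> float:
    radio = 55.0 * meters                     # assumed cellular module per meter
    data_plan = 0.25 * 12 * years * meters    # assumed $0.25/meter/month M2M plan
    return radio + data_plan

print(f"Mesh TCO:     ${mesh_tco(METERS, YEARS):,.0f}")
print(f"Cellular TCO: ${cellular_tco(METERS, YEARS):,.0f}")
# The point is structural: mesh trades lower per-meter hardware cost
# for concentrators plus a standing network-operations staff, while
# cellular trades higher per-meter cost for an operator-run network.
```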
Cellular networks benefit from the massive scale and standards-based technology of the mobile services industry. AT&T is responsible for extending and upgrading its network, and the cost of doing so is amortized across millions of mobile phone users rather than borne solely by utility operations. TNMP believes it was negatively impacted by using proprietary protocols in the past, and specifically wanted a modular, extensible solution that eliminates stranded assets. Over the lifetime of the deployment, the utility believes this will be much more efficient.
In addition, TNMP views cellular network connectivity as inherently more robust and more secure than a self-built network. If the network were disrupted by a natural disaster, for example, AT&T would take responsibility for quickly restoring connectivity. AT&T has long experience and deep expertise in securing mobile communications and can apply this to TNMP's smart grid communications. Furthermore, because of the point-to-point (P2P) nature of cellular communications, if one smart meter is compromised, the other smart meters in the system remain isolated from the threat; there are no utility-specific data concentrators at risk of being compromised.
Finally, available bandwidth is higher on a P2P cellular network than on a shared RF mesh network or PLC. While high bandwidth is not a critical need for meter reading operations, TNMP anticipates that bandwidth needs will increase as new applications are deployed on the smart grid. For example, the utility envisions public service announcements related to Texas's volatile weather someday being transmitted over the smart grid network into the home.

Horizontal Standards for M2M

Sep 9, 2011 / by Admin

Efforts to develop broad, horizontal standards for the M2M market are gaining momentum. The most important activity is occurring within the context of the International Telecommunication Union’s (ITU) Global Standards Collaboration (GSC), which has established the “M2M Standardization Task Force” (MSTF) to coordinate the efforts of individual standards development organizations (SDOs). While most M2M applications are developed today in a highly customized fashion, and vertical-specific industry bodies are busy crafting standards for markets ranging from the smart grid to the auto industry, it will be broad horizontal standards that will be the major impetus to growth in the middle and later years of this decade.

The goal of these efforts is to define a conceptual framework for M2M applications that is agnostic to vertical industry and communication technology, and to specify a service layer that enables application developers to create applications that operate transparently across different vertical domains and communication technologies, without the developers having to write their own complex custom service layer. This is a key requirement for the M2M industry to move from its current state – applications existing in isolated silos based on vertical market or underlying technology – to a truly interconnected “Internet of Things”.
SDOs want to incorporate existing standards into this conceptual framework as much as possible. Rather than re-inventing what already exists, the SDOs prefer to identify and fill gaps, integrating existing work into the unified framework described above. This approach recognizes that it is impossible, or at least undesirable, to try to define physical layer technologies or networking layer protocols for every current or future M2M application. Different vertical applications will optimize for their individual cost and functionality requirements, while a standardized service layer will facilitate cross-vertical application development.
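As a concrete illustration of what such a service layer buys a developer, here is an entirely hypothetical sketch: applications code against a single device-access API while transport-specific adapters hide whether a device sits on cellular, ZigBee, or PLC. All interface names are invented for illustration and do not come from any SDO draft.

```python
# Hypothetical sketch of a horizontal M2M service layer: one
# application-facing API, with transport details pushed into
# pluggable adapters. Names are invented for illustration and are
# not from any SDO specification.
from abc import ABC, abstractmethod

class TransportAdapter(ABC):
    """Hides the underlying network (cellular, ZigBee, PLC, ...)."""
    @abstractmethod
    def send(self, device_id: str, payload: bytes) -> None: ...
    @abstractmethod
    def receive(self, device_id: str) -> bytes: ...

class CellularAdapter(TransportAdapter):
    def send(self, device_id, payload):
        print(f"[cellular] -> {device_id}: {payload!r}")
    def receive(self, device_id):
        return b'{"reading": 42}'

class ServiceLayer:
    """What applications code against, regardless of vertical or transport."""
    def __init__(self):
        self._adapters: dict[str, TransportAdapter] = {}
    def register(self, device_id: str, adapter: TransportAdapter):
        self._adapters[device_id] = adapter
    def read(self, device_id: str) -> bytes:
        return self._adapters[device_id].receive(device_id)
    def command(self, device_id: str, payload: bytes):
        self._adapters[device_id].send(device_id, payload)

# A smart grid app and a telematics app would use the same calls;
# only the registered adapter differs.
layer = ServiceLayer()
layer.register("meter-0017", CellularAdapter())
print(layer.read("meter-0017"))
layer.command("meter-0017", b"demand-response:shed-2kW")
```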
ABI Research believes that an initial proposal for such a framework and service layer could be available by early 2012. It would likely take another 18 – 24 months for this initial proposal to be formally published as a standard or set of standards. Therefore, ABI Research doesn’t expect these efforts to start having an impact until the late 2013 to early 2014 timeframe. We do expect that once such standards are in place, they will play an important role in driving overall M2M market development.
The key benefits of horizontal standards will be faster and less costly application development, and more functional and more secure applications. Similar to the market benefit of third-party apps running on smartphone platforms, M2M apps developed on horizontal platforms will be able to make easier use of underlying technologies and services. App developers will not have to pull together the entire value chain or have expertise in esoteric skill sets such as RF engineering. This will dramatically increase the rate of innovation in the industry, in addition to creating more cross-linkages between various M2M applications. Think about future plug-in hybrid electric vehicles communicating with drivers' homes, the road, various infotainment services, the smart grid, and so on, and you start to get a taste of what is coming.

The Other Google/Motorola Story

Sep 9, 2011 / by Admin

Apart from a good article on CNN.com, there has not been much discussion of the ramifications of Google's acquisition of Motorola Mobility for the home automation market. Google had earlier announced its “Android@Home” initiative on May 9, 2011. Likewise, Motorola Mobility had acquired home automation platform specialist 4Home in December 2010. ABI Research believes the acquisition provides a unique opportunity for Google to differentiate itself in the home automation market.
Google now finds itself with two distinct yet overlapping home automation platforms. Given that Verizon and 25 other service providers around the world have either announced managed home automation solutions based on the 4Home platform or are developing offerings based on it, it would be truly unfortunate for Google to shut down the 4Home effort. There is perhaps more reason to shut down the more recent, and unproven, Android@Home effort, especially given Google's previous struggles in the home systems market (remember PowerMeter?) and its apparent strategy of creating a whole new short-range wireless standard to go along with the middleware. (Given the existence of ZigBee, Z-Wave, HomePlug, low-power Wi-Fi, and others, the intention to launch yet another home automation physical layer standard is almost nonsensical, and points to a fundamental lack of understanding of the home automation market.)
Rather than getting rid of one or the other, Google should differentiate itself by offering a tiered approach to home automation. In short, provide Android@Home as a free platform that developers can use to create basic home monitoring and control systems (apps), with the smartphone as the control hub and host device. This would be particularly targeted at younger customers, perhaps living in apartments, who could be introduced to home automation in a very simple, cut-price manner. The approach could even accommodate couples; home automation system vendor Lagotek has demonstrated the feasibility of having control software reside at more than one control point in the home, and there is little reason why both members of a couple couldn't share control of their home's automation from their separate smartphones.
Building on this entry-level tier, Google should continue to support the 4Home platform as a key enabler of service-provider-managed home automation offerings. Customers introduced to home automation as younger singletons are likely to be far more apt to adopt home automation as they mature and their families grow than if the service provider has to “educate” them from scratch about the need for, and benefits of, home automation. We could even see service providers explicitly offer these tiers to their customers: start with a smartphone app, graduate to a whole-home automation system.
Google has a unique opportunity to truly differentiate itself in the home automation market, both through a tiered approach and by leveraging the Android ecosystem to provide mass-market education about home automation and funnel potential customers to 4Home-based systems. Will Google be able to focus on this opportunity while it manages its competitive strategy in the handset and tablet industries? We'll see…
For more information on the home automation industry, please see ABI Research’s Home Automation Systems research service.

Qualcomm gains more video technologies

Sep 7, 2011 / by Admin

Today, Qualcomm and IDT announced that Qualcomm had acquired two video-related technologies (and their design teams) from IDT: the company's Hollywood Quality Video (HQV) and Frame Rate Conversion (FRC) groups. These technologies are used to drive video displays of all types, as well as to convert relatively low-quality (or low-bitrate) video, including SD video, to high definition. We have commented before (including in our Set-Top Box SoC report) that Qualcomm's acquisition of Atheros points it toward the connected home. This move makes sense for Qualcomm in many ways, including:

Today's mobile technologies (i.e., apps and Android) point the direction for tomorrow's home video technologies
Atheros HomePlug technologies and Hy-Fi (HomePlug, Wi-Fi, and Ethernet) naturally connect video in the home
In developing regions, where wireless infrastructure (including fixed wireless) has outpaced fiber and wireline infrastructure, many operators will deliver video experiences over wireless platforms. Up-conversion technologies like those from IDT will be required to give customers engaging, smooth video quality (even if not true HD) over wireless networks (a toy sketch of frame rate conversion follows this list).
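To give a flavor of what an FRC block does, here is a naive linear-blend sketch in a few lines of Python. Shipping FRC silicon such as IDT's uses motion-compensated interpolation; this toy only illustrates the concept of synthesizing intermediate frames.

```python
# Naive frame rate conversion by linear blending: doubles the frame
# rate by inserting an average of each pair of neighboring frames.
# Real FRC hardware uses motion-compensated interpolation; this is
# only a toy to show the concept.
import numpy as np

def double_frame_rate(frames: np.ndarray) -> np.ndarray:
    """frames: (N, H, W, C) uint8 video -> (2N-1, H, W, C)."""
    f = frames.astype(np.float32)
    midpoints = (f[:-1] + f[1:]) / 2.0  # blend neighboring frames
    out = np.empty((2 * len(frames) - 1, *frames.shape[1:]), dtype=np.float32)
    out[0::2] = f           # originals at even slots
    out[1::2] = midpoints   # interpolated frames in between
    return out.astype(np.uint8)

# 24 fps in, ~48 fps out (10-frame random test clip).
clip = np.random.randint(0, 256, size=(10, 72, 128, 3), dtype=np.uint8)
print(double_frame_rate(clip).shape)  # (19, 72, 128, 3)
```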
This acquisition therefore gives Qualcomm yet another tool to integrate into its SoCs. These technologies make sense in high-end feature phones (notably those with HDMI outputs), but they also help amass the technologies required to compete in digital TV and set-top box chips. The only missing link is MoCA; an acquisition of Entropic would be the final signal that Qualcomm is dedicated to video in the home market.


“So It Begins” – Netflix’s Journey Towards the Pay-TV Precipice

Sep 2, 2011 / by Admin

Despite Netflix's recent decision to split the hybrid $9.99 plan into two discrete tiers (streaming only and DVD only, each priced at $7.99 per month), which was intended to generate additional revenue as a means to secure additional streaming content, it looks as if the service might lose Starz content come February 28, 2012, when the current agreement expires. While the cessation of Starz content on the streaming service is not an absolute certainty (a different arrangement might be reached between now and February 2012), Starz recently ended licensing negotiations with Netflix, having presumably reached an impasse on the value of its content. Netflix's stock naturally declined following the announcement, but is this just a quick reaction to bad news, or does it portend a more arduous future than the gilded road the company had seemingly been walking?

Netflix is encountering many of the same roadblocks faced by other streaming services before it, and while the company's large installed base insulated it for a time, the realities of the video market are starting to come to light. Hulu stands as a prime example of the difficult negotiating process between a streaming company and content owners, given the difficulties Hulu had negotiating for content with its own stakeholders. Others, like Google TV, arrived as lame ducks (in terms of content) after content holders largely refused to support the platform. It is clear that content owners are working diligently to maintain the premium image of their product, and this commitment will ultimately force companies like Netflix to make a tough decision: either maintain a complementary status to the pay-TV operators, or take the plunge and become a pay-TV operator itself.

While this might not reach the level of a virtual multichannel video programming distributor (VMVPD), it will likely mean pricing similar (or equivalent) to a traditional pay-TV operator's. In other words, if consumers want the content, they will pay for it one way or another – be it through a traditional MVPD or through a service like Netflix. While Netflix continues to cite its customer base's penchant for “older” content (in terms of streams served), it is foolish to assume this is a viable long-term strategy.

Netflix has rather brilliantly spun changes to its service as fitting the behavior of its customer base. The 28-day delay window for new content was brushed off by citing customers' proclivity for older content: a large portion of the content sent out to customers was older material. But the reality is that many customers who selected newer releases often had to wait for that content to arrive, and in the meantime received older movies from their queue instead. In addition, new movies (that customers want to see) are not released every month, let alone weekly, so to get value out of their monthly subscription fee many consumers also added older movies to their queues. To say customers prefer older movies to newer ones is not entirely accurate.

Now, with Starz, Netflix was quick to point out that Starz accounts for only 8% of viewing, down from a peak of 20%, but again this fails to address the real problem. The peak in Starz viewing likely occurred not long after the initial deal was put in place, and discounting 8% of viewing is foolhardy: Netflix does not add new content at a robust rate, and the fact that subscribers were still watching Starz content at nearly 10% of total volume suggests that consumers still felt it was valuable enough to revisit previously viewed shows and movies from the Starz library. Granted, Netflix will likely take the funds earmarked for Starz and put them elsewhere, but securing another pool of Japanese anime will only cater to a niche segment (yes, I'm being facetious to some degree).

In the end, Netflix will likely have to raise its monthly subscription rate to secure better content, and while this could come at the cost of some subscribers, it will nonetheless be a necessary move if the company wishes to continue as a significant player in the streaming video market. Netflix will have to lose the illusion that the size of its customer base (which could very well start declining) will force the video industry to move at its command, or that customers will always be there – otherwise the company runs the risk of becoming the next Blockbuster Video.

