ABI Research Blog (115)

Combination chipsets to start ramping up in the new year?

Jan 5, 2011 12:00:00 AM / by Admin


Atheros has announced several new 802.11n and Bluetooth 4.0 products in a variety of formats. Atheros coming up with a combination solution is not surprising; the company has previously offered combination solutions on a single board to provide integrated products. What is different about this solution is that it combines 802.11n and Bluetooth 4.0 in a single chipset.

Atheros has been a notable player in the short-range wireless industry. In the past, it had focused mainly on stand-alone chipsets. With the announcement of the AR9485, a system-in-package that includes 802.11n and Bluetooth 4.0, Atheros plans to move into the mobile computing and tablet segment of the market.

This could mark an inflection point for the combination chipset market. There is enough interest in combination chipsets in the mobile computing and tablet market, not to mention the mobile device and handheld market where real estate on a device is even more scarce, for a company that had previously focused on stand-alone chipsets to now showcase a combination chipset. Keep an eye out for a ramp-up of combination chipsets in 2011 across various segments of the market.




Pre CES 2011 - Thinking About the Technology of 2010

Jan 4, 2011 12:00:00 AM / by Admin


As an avid consumer of technology and an analyst in said field, I often find myself on the bleeding edge of the adoption curve (which as a consumer has its ups and downs) and as CES quickly approaches I started to reflect on my interactions with newer technologies over the past year.
3D was the hot subject at CES 2010, and despite its relatively lukewarm reception, as an early adopter I naturally had to buy a 3D TV. As a consumer I would say my 3D experience to date has been a mixed bag. The glasses are not as troublesome as I had previously thought, but at the same time I certainly wouldn't miss wearing them. Considering we'll have glasses for the near future, hopefully we'll see a movement towards universal glasses at CES 2011. What has proven the most difficult so far, however, is integrating the TV into my preexisting home theater, which includes a 7.1-channel AVR with HDMI 1.3a support.
Naturally, as an analyst, I knew HDMI 1.4a was necessary to natively support 3D, but I also knew a 3D pass-through could be implemented, so I thought I would give it a try…no such luck. As far as my Sony PS3 was concerned, I didn't own a 3D TV if I went through the AVR (and there has been no mention of a firmware update from the AVR manufacturer). So in order to support both HD audio and 3D, my only options were to: 1) upgrade my AVR; 2) try to find one of the two Blu-ray players that offer dual HDMI outputs (and are 3D-ready); or 3) swap cables whenever I want to watch a 3D movie (bypassing the AVR).
I also found my options for content rather limited, with only a handful of 3D Blu-ray titles available – not to mention the exclusive titles. The last part in particular is rather troubling: one of the biggest hurdles to 3D has been limited content, and while the exclusive deals might have been lucrative to some parties, they didn't help nurture a healthier market for 3D overall.
Another big push has been Internet-connected devices and OTT content. Yes, my TV included an Internet connection (wired/Ethernet out of the box), but as a consumer I haven't found the experience all that compelling. In total I currently have access to 10 applications, 2 of which appear to be news programming clips in German. Others, like YouTube, are nice, but typing in search words with the alphanumeric keypad is too slow when I have a laptop or mobile phone not too far away. So again, another hope for CES 2011: better remote controls or input devices (as you can probably guess, I didn't buy a Vizio 3D TV, since Vizio includes a Bluetooth sliding QWERTY keyboard remote). On a somewhat related note, Microsoft's Kinect works relatively well as a motion control for games and for navigating the Xbox 360 menus, although ample space is required.
Referring back to OTT content, however, the game consoles, be it the Sony PS3 or the Xbox 360 with Xbox Live, have offered a more "complete" experience, taken together with movies, games, live content, music, and a browser. Even better, I was afforded the opportunity to trial Plaster Networks powerline adapters to connect my Xbox 360 (since it didn't come with Wi-Fi built in…remember, I'm an early adopter), and truth be told the setup was painless (simply plug the adapter into the wall and connect the Ethernet cables). Thus far the experience has been excellent – ESPN 360 on Xbox Live looks great and runs without interruption. As we see more networking options over the coming years, if not next year, hopefully we can expect similar results – especially if initiatives like IEEE 1905.1, G.hn, or hybrid home network solutions come to fruition and foster a more cohesive and seamless multi-technology/medium network. So it will be interesting to see what networking solutions are on hand at CES this year.
For 3D and Internet-connected devices, though, if content is king, cost is certainly queen, so attention to things like interoperable glasses and support for legacy installations will remain paramount. Hopefully these are some of the things we will see at CES 2011.

eBook Reader Shipments (and Rumors) Hit All-Time High in 2010

Dec 30, 2010 12:00:00 AM / by Admin


Vendors like to constrain access to product performance data, especially in the early stages of a market. This keeps competitors and investors at bay allowing manufacturers to build awareness and hype. As expected, credible details on shipments for eBook Reader devices have been sparse.

Vendors and so-called “sources familiar with the situation” recently turned up the volume on how well the mobile CE device category is doing. Whether it was about Amazon, Barnes & Noble, Hanvon, or Sony, the message was similar: “millions” of devices have shipped and significant gains were seen in comparison to 2009.

ABI Research expects more than 11 million eBook Reader devices to ship in 2010, with the majority destined for the United States and including built-in 3G wireless connectivity. Contrast that with the fewer than 4 million devices shipped worldwide in 2009.

Some pundits felt media tablets, such as Apple's iPad, would obviate the need for dedicated consumer electronics reader devices. This couldn't be further from the real world: Amazon executives directly tackled the claims, saying that their results don't reflect the supposed cannibalization and that consumers are even choosing to acquire both types of slate-shaped devices.

A fundamental change to reading digital content occurred in mid-2009, when the dedicated eReader vendors started offering reader software applications for PCs, media tablets, and smartphones. This shift in strategy signaled that no single platform would garner all the eyes. The uptake on additional "screens" bodes well for digital content bookstores.

The statistics emerging today for the eBook Reader market reiterate what was already understood; they are just now coming from the vendors themselves. Media tablets have created incremental opportunities for the consumption of the written word. It has been a good year for eReaders, but there is still a long road ahead to becoming a mass-market opportunity.


LTE Modems – A 12-Month Retrospective

Dec 23, 2010 12:00:00 AM / by Admin


It has been a year since the first mobile networks offering 3GPP's Long Term Evolution (LTE) mobile broadband service went live at TeliaSonera in Stockholm, Sweden, and Oslo, Norway. Early modem offerings for LTE networks have been rudimentary at best.

The capability to develop and manufacture LTE modems saw some strategic chess moves in 2010. Nokia sold its LTE modem unit to Renesas in July for $200 million, while 4G chipset upstart Beceem got scooped up by Broadcom. France's Sequans, backed by Alcatel-Lucent and Motorola, continued its LTE chipset development. And Altair announced it has teamed with IPWireless to develop a range of multi-mode LTE modem products.
Samsung was the early USB modem provider for the TeliaSonera networks. The fledgling Verizon Wireless LTE network in the US offers LG’s VL600 and Pantech’s UML290 USB modem models.
The latest news in LTE modems is the availability of Huawei's tri-mode E398 GSM/UMTS/LTE USB modem stick, originally shown off at Mobile World Congress this past February. The Qualcomm MDM9200-powered modem is immediately available at LTE operators including Mobilkom Austria (Austria) and Net4Mobility (Sweden). Operators in Denmark, Germany, and Norway are expected to join the list during 2011.
Earlier this month, Smith Micro Software announced that the company's connection manager software became available with LTE modem support. Connection management solutions are best recognized as the user interface that gets a computer's modem connected to the mobile network. More recently, connection management has become a one-stop shop for operators to deploy a variety of modems across a range of network types. The Smith Micro QuickLink Mobile product enables the most appropriate network connection depending on location and navigates between different network architectures, including Wi-Fi, 3G cellular, and 4G mobile broadband.
A lot has transpired in the first year of LTE network availability, but the real opportunity still lies in the future. Aftermarket modems (such as the USB examples provided here) are expected to drive early subscriptions to LTE networks. The year 2010 didn't turn out to be "the year of LTE," but 2011 looks ready to welcome access to faster broadband speeds on the go in more places.

AT&T's Purchase of Qualcomm's 700 MHz Spectrum

Dec 22, 2010 12:00:00 AM / by Admin


Qualcomm sold its 700 MHz spectrum to AT&T for $1.925 billion. This includes 12 MHz of the D and E block spectrum (formerly Channels 55 and 56) covering over 70 million people, including New York, Boston, Philadelphia, Los Angeles, and San Francisco. It also includes 6 MHz of D block spectrum covering over 230 million people across the rest of the United States. In total, the spectrum covers over 300 million people. The sale is expected to close during the second half of 2011.

AT&T paid $6.637 billion in the 2008 auction for its current 700 MHz spectrum. That purchase comprised 150 licenses of B block spectrum, which consists of 12 MHz. (Verizon got 22 MHz of spectrum, enough for one 10 MHz downlink channel and one 10 MHz uplink channel.) AT&T's spectrum is only in urban areas, so it had less spectrum for 4G than Verizon, covering a smaller geographical area and fewer people. The new spectrum will help AT&T increase its downlink bandwidth using carrier aggregation at a later point.
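To see why the extra downlink spectrum matters, a back-of-envelope calculation helps: for a given spectral efficiency, sector throughput scales roughly linearly with usable downlink bandwidth. The spectral-efficiency figure below is an illustrative assumption for this sketch, not a measured operator value.

```python
# Rough illustration: LTE downlink capacity scales with downlink bandwidth.
# The 1.7 bit/s/Hz average spectral efficiency is an assumed, illustrative figure.

def peak_downlink_mbps(downlink_mhz, bits_per_sec_per_hz=1.7):
    """Rough sector throughput estimate: bandwidth x average spectral efficiency."""
    return downlink_mhz * bits_per_sec_per_hz

# Verizon's 22 MHz of paired spectrum yields a 10 MHz downlink channel;
# AT&T's original 12 MHz B block yields a 6 MHz downlink channel.
verizon_mbps = peak_downlink_mbps(10)
att_b_block_mbps = peak_downlink_mbps(6)
```

Under these assumptions Verizon's wider channel gives roughly 17 Mbps per sector versus roughly 10 Mbps for AT&T's B block alone, which is the gap carrier aggregation of the new D/E block spectrum would help close.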

Overall, this is not a huge deal, but is important for AT&T to compete against Verizon. For the downlink, it puts AT&T more on par with Verizon in rural areas and gives AT&T an advantage over Verizon in urban areas. It will do nothing for AT&T's disadvantage on its uplink spectrum.



The ITU Continues to Confuse People on 4G

Dec 22, 2010 12:00:00 AM / by Admin


I want to try to clear something up here, because a lot of people think that the ITU has accepted HSPA+ as a 4G technology. This is wrong on so many levels.

The ITU does not control the term "4G." It has been trying to for a long time now, but 4G is not a specification or standard. It means fourth generation - in this case, the fourth generation of WWAN air interface technology. To talk about speeds is to misunderstand what these technologies are. 3G technologies were all primarily based on CDMA. HSPA+ is a 3G technology that is compatible with WCDMA, HSDPA, and HSUPA (together, HSPA). WiMAX and LTE are 4G technologies because they are OFDMA-based and will be compatible with WiMAX2 and LTE-Advanced, respectively. The ITU defines IMT-2000 and IMT-Advanced based on performance requirements. This has nothing to do with the fundamental generation of the technology, and it has included a mix of technologies; IMT-2000, for example, contains 3G and 4G technologies. As long as you understand that IMT-Advanced does not equal 4G, all this confusion goes away.

The ITU backed away from tying IMT-Advanced and 4G together after stirring up a debate and after the IEEE sent a letter disagreeing with the ITU's use of 4G and pointing out that the working group within the ITU had specifically recommended avoiding the term 4G. Buried in a mixed press release, the ITU acknowledged that other technologies can also be referred to as 4G, but it also mentioned advanced 3G technologies. (Saying a 3G technology can be 4G does not make sense, though.) So people assumed the ITU had relaxed its requirements. No, it didn't: IMT-Advanced is still IMT-Advanced - only WiMAX2 and LTE-Advanced meet the ITU's requirements for IMT-Advanced. Just because the ITU seemed to say that HSPA+ can be 4G too does not make it so.

4G (which is OFDMA-based, where 3G is CDMA-based) is a completely separate thing from the ITU's performance requirements for IMT-Advanced.


Net Neutrality: No One is Real Happy, and Consumers Are Left to Wonder

Dec 22, 2010 12:00:00 AM / by Admin


The FCC finally issued its Net Neutrality plan – and the wireless industry got a better deal than the wired Internet providers. No one seems too happy. The plan is likely to end up in court, and a battle in Congress is brewing. Consumers are left to wonder what they got in the current stew.

The partisan 3-2 ruling (three Democrats in favor, two Republicans against) means the wired Internet service providers (Verizon, Comcast, AT&T and others) will have less say over how they manage their networks. Wireless carriers get less-stringent rules because, in the view of FCC Chairman Julius Genachowski, they face more congestion issues and need more leeway in managing their traffic.


Online Mapping and Offboard Navigation going Offline and Onboard

Dec 21, 2010 12:00:00 AM / by Admin


For all the talk about the benefits of cloud-based services, a recent trend has seen many location vendors offering local storage of maps and/or allowing navigation to continue when temporarily out of wireless coverage. It seems the industry has finally come to terms with reality - with the help of its customers - in acknowledging that wireless coverage and connectivity still aren't as good as they should be. While this is true for regular navigation use, it is even more critical for outdoor navigation applications such as skiing or mountain climbing. In Europe in particular, high data roaming tariffs have also held back the adoption of online mapping and offboard navigation.
Some examples of recent product launches and offerings:
Google Maps / Google Maps Navigation – The latest iteration of Google Maps for mobile for Android phones includes an offline feature proactively caching maps of frequently visited areas. In turn Google Maps Navigation now allows offline rerouting when a turn is missed and the connection is lost. However, a connection is still required when starting a new route.
Skobbler’s ForeverMap - The latest release of Germany-based Skobbler’s ForeverMap allows installing maps for individual countries offline. Address search, location finder, route calculation for pedestrians or cars and information on places of interest are also available offline.
Augmentra’s Viewranger - UK-based outdoor navigation vendor Augmentra recently added access to online maps such as OpenStreetMap (road, cycling, terrain, skiing tracks) and Bing Maps (road and aerial), as well as a local caching feature. It provides an alternative to very expensive topographical maps.
Fullpower’s MotionX GPS – Fullpower’s successful outdoor and fitness application allows caching road, terrain, and marine maps.

It is interesting to see some vendors provide tools to monitor and manage the on-board storage of maps. Viewranger includes settings for specifying the maximum cached data volume and the storage duration after which downloaded data is deleted. Ways to download offline maps vary widely: Viewranger automatically downloads maps at the right zoom and detail level when hovering over them; Skobbler allows downloading entire countries; and MotionX GPS allows specifying circular areas and corridors. Google has been more vague about how maps are cached locally, stating on its blog that "Maps will automatically start caching the areas you visit the most when your device is plugged in and connected to Wi-Fi." It is also important to remember that vendors such as Telmap have offered a limited caching feature for many years.
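The Viewranger-style settings described above - a maximum cached data volume plus a storage duration after which downloaded data is deleted - amount to a size- and age-bounded cache. A minimal sketch of such a policy, with all class and method names hypothetical rather than taken from any vendor's software:

```python
import time
from collections import OrderedDict

class MapTileCache:
    """Illustrative size- and age-bounded tile cache; oldest entries evicted first."""

    def __init__(self, max_bytes, max_age_seconds):
        self.max_bytes = max_bytes
        self.max_age = max_age_seconds
        self.used = 0
        self.tiles = OrderedDict()  # key -> (insert_timestamp, data)

    def put(self, key, data, now=None):
        now = time.time() if now is None else now
        self._expire(now)
        if key in self.tiles:
            self.used -= len(self.tiles.pop(key)[1])
        self.tiles[key] = (now, data)
        self.used += len(data)
        # Enforce the maximum cached data volume by evicting oldest entries.
        while self.used > self.max_bytes and self.tiles:
            _, (_, old_data) = self.tiles.popitem(last=False)
            self.used -= len(old_data)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        self._expire(now)
        entry = self.tiles.get(key)
        return entry[1] if entry else None

    def _expire(self, now):
        # Enforce the storage duration: drop entries older than max_age.
        while self.tiles:
            key, (ts, data) = next(iter(self.tiles.items()))
            if now - ts <= self.max_age:
                break
            del self.tiles[key]
            self.used -= len(data)
```

Real products would likely evict by zoom level or distance from frequently visited areas rather than pure insertion order, but the two knobs (volume cap and age cap) map directly onto the settings the post describes.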

The major issue with caching maps across a wide range of zoom levels is the volume of data, which can easily reach several gigabytes for medium-sized cities. While smartphones ship with increasingly large amounts of built-in memory, Google preferred to wait until the launch of vectorized maps, which allow extensive zoom levels without blowing up the size of the cached maps.
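A back-of-envelope calculation shows why raster caches balloon: in a standard slippy-map scheme, the world is divided into 4^z tiles at zoom level z, so each extra zoom level quadruples the tile count. The per-tile size and metro-area figures below are illustrative assumptions, not measured values:

```python
# Rough raster map cache size estimate for standard 256x256 px tiles.
# Assumed average of 15 KB per compressed tile -- illustrative only.

TILE_KB = 15
WORLD_KM2 = 510e6  # Earth's total surface area in km^2

def tiles_covering(area_km2, zoom):
    """Approximate tile count for an area: the world is 4**zoom tiles at a given zoom."""
    return (area_km2 / WORLD_KM2) * 4 ** zoom

def cache_mb(area_km2, zoom_min, zoom_max):
    """Total cache size in MB across an inclusive range of zoom levels."""
    total_tiles = sum(tiles_covering(area_km2, z) for z in range(zoom_min, zoom_max + 1))
    return total_tiles * TILE_KB / 1024

# A hypothetical ~1,000 km^2 metro area cached at zoom levels 10 through 18:
metro_cache = cache_mb(1000, 10, 18)
```

Under these assumptions the cache lands in the low gigabytes for one metro area, consistent with the post's "several gigabytes" figure, and the 4x-per-level growth explains why vector maps (which encode geometry once and render any zoom) sidestep the problem.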

Obviously, what we are seeing here is a gradual move towards hybrid solutions, combining unlimited access to always up-to-date online maps and services wherever and whenever possible with the ability to cache maps locally when required. This form of side-loading and caching map data over Wi-Fi, or wirelessly during off-peak hours, also reduces the volume of wireless data downloads, which is favorable both for end users and for carriers desperately looking to reduce the load on their networks. At the same time, it constitutes a threat to the traditional on-board map licensing model, with further pricing erosion driven by the widespread availability of quickly improving, free, and in some cases very detailed OpenStreetMap data.


Sam Rosen discusses Web TV on NPR

Dec 17, 2010 12:00:00 AM / by Admin


ABI Research analyst Sam Rosen appeared on NPR's On Point show on Thursday, December 16th. The discussion centered on Web TV - the technology and its impact on traditional pay-TV providers. Listen to the replay here:

http://www.onpointradio.org/2010/12/web-tv


The ITU Indirectly Admits That Its Use of "4G" Was Wrong

Dec 10, 2010 12:00:00 AM / by Admin


Buried in a multi-purpose press release, the ITU has backed away from its use of 4G.

"Following a detailed evaluation against stringent technical and operational criteria, ITU has determined that “LTE-Advanced” and “WirelessMAN-Advanced” should be accorded the official designation of IMT-Advanced. As the most advanced technologies currently defined for global wireless mobile broadband communications, IMT-Advanced is considered as “4G”, although it is recognized that this term, while undefined, may also be applied to the forerunners of these technologies, LTE and WiMax, and to other evolved 3G technologies providing a substantial level of improvement in performance and capabilities with respect to the initial third generation systems now deployed. The detailed specifications of the IMT-Advanced technologies will be provided in a new ITU-R Recommendation expected in early 2012."

WiMAX (802.16e) and LTE are 4G technologies, of course. But the ITU has moved from one mistake to another. It is essentially saying - without specifically naming the technology - that HSPA+ is 4G. This is wrong: HSPA+ is a 3G technology, and the ITU even acknowledges as much in the very same sentence. To paraphrase:

"A 3G technology is a 4G technology."

This doesn't make sense, and it is apparent that the ITU has not learned from its first mistake. The ITU should stick to IMT-2000 and IMT-Advanced and stop trying to use the 4G label for political reasons.



