Monday, December 21, 2009

WHAT DIGITAL TV REALLY MEANS

In 2006, a global project to migrate from analogue TV to digital TV was initiated during a telecommunications conference in Geneva. A deadline of June 2015 was set, by which all broadcasters are expected to have migrated to the digital platform. Kenya has since followed this global trend and has become the third country in Africa to commence its migration to digital TV. Following the presidential inauguration of digital TV at the Kenya Broadcasting Corporation (KBC) transmitting station, debate has been brewing in the public domain. Of particular concern is the cost of migrating to digital-capable television sets, either by purchasing a new compatible set or a digital converter. The government says the converter boxes should be priced at between Sh3,000 and Sh5,000, but consumers say they are currently priced at Sh10,000. At present, there are over 4 million household TV sets, most of which are not capable of processing the digital signal. The migration will therefore compel the majority of Kenyans to dig deeper into their pockets. However, many consumers are unaware of the benefits of digital television and of the reasons behind the world’s migration from traditional wideband analogue transmission (vestigial sideband) to modern, more bandwidth-efficient digital transmission.

By definition, digital television (DTV) is the sending and receiving of moving images and sound by means of discrete (digital) signals. The signals can be transmitted over the air or through copper or fiber optic media. There are various methods of receiving the transmitted digital signals. The most common is Digital Terrestrial Television (DTT), which broadcasts land-based signals and uses an aerial on the receiving end, the same as for an analogue signal. This requires a Digital Video Broadcasting - Terrestrial (DVB-T) enabled TV set or an MPEG-4 digital converter. Another method is digital cable from a cable television company, where the signal is delivered over coaxial or fiber optic cable. Digital television can also be received over the Internet Protocol (IP), usually referred to as IPTV, using a broadband connection to an Internet Service Provider (ISP). The last method is reception on handheld devices such as smartphones that have been configured to receive the signal through a mobile provider’s network.

Digital television presents a number of opportunities to the Kenyan Information and Communication Technology (ICT) industry. DTV has several advantages over analogue TV, the most significant being that digital channels take up less bandwidth, and their bandwidth needs are continuously variable, with a corresponding trade-off in image quality depending on the level of compression and the resolution of the transmitted image. This means that digital broadcasters can provide more digital channels in the same spectrum space, provide high-definition television service, or provide other non-television services such as multimedia or interactivity. DTV also permits special services such as multiplexing (more than one programme on the same channel), electronic programme guides (EPG) and additional languages (spoken or subtitled). Engineers and software developers could also benefit from installation work and from developing software that helps viewers record programmes for later viewing.

The Kenya Broadcasting Corporation’s (KBC) test runs will focus on the first method because of its infrastructural economics in terms of transmission and receiving equipment; the other methods would call for higher capital expenditure if they were to have country-wide coverage. The service is being operated by Signet, a subsidiary of KBC specifically set up to broadcast and distribute the DTT signals. As the government works on subsidies or incentives for consumers to purchase compliant equipment, the University of Nairobi and Jomo Kenyatta University are said to have taken up the challenge of developing locally assembled analogue-to-digital converters. Digital television signals will not interfere with the analogue signals, and the two will coexist until analogue television is phased out. Currently, the transmission covers Nairobi and its environs, among them Kajiado, Machakos, Naivasha and Murang’a. In these areas, viewers with a digital-enabled television set can enjoy good picture quality and a telezine. A telezine, short for television magazine, is an interactive menu from which a viewer can get information from the television station, such as news updates and the company profile, by simply using the remote control.

The digital TV coverage is expected to spread gradually to the rest of the country to pave the way for complete migration by the year 2012. The complete switch to digital broadcasting is expected to cost Sh6 billion (USD 80 million), and an initial Sh152 million (USD 2 million) has already been allocated. Broadcasters will be required to sign transmission contracts with Signet upon licensing by the Communications Commission of Kenya (CCK). Signet will carry private broadcasters’ signals free of charge, but will charge for its services after 2012. This means that broadcasters can concentrate on content development instead of incurring costs on non-core business issues such as building and maintaining infrastructure. However, this model poses some challenges. Technically, it introduces a single point of failure for national broadcasting: if the Signet transmission base is down, the entire country could be thrown into a television ‘blackout’. Secondly, a state-owned transmission company is prone to political interference, especially when transmitting content unfriendly to the government. The Kenya Media Owners Association has already expressed its concerns about the role of Signet. During the inaugural ceremony, the Standard Group vice-chairman, Mr. Paul Melly, said his concern was that the service provider is KBC, which is owned by the government, and emphasized that the media industry wants to be assured that Signet will play its role properly.

Currently, the demand for new TV and radio broadcasting frequencies surpasses the supply by a huge margin. There are over 60 applications for TV licenses and more than 150 for FM radio. DTV will greatly reduce the bandwidth consumed per TV channel compared with analogue transmission. This will provide room for additional broadcasters, or for bandwidth to be allocated to non-television services. The sale of non-television services may provide an additional revenue source. Telco companies in Kenya can also provide the DTV signal over their infrastructure in addition to voice and data services. This was typical in the US, where the coaxial cable network was used to provide Internet and cable TV services. Internet Service Providers can deliver DTV using broadband connections, increasing revenue as consumers purchase additional bandwidth to cater for the IPTV service. As mobile service providers in Kenya continue to invent value-added services for their subscribers, DTV will feature significantly. Safaricom has already signed an agreement with Nokia and DMTV concerning a Digital Video Broadcasting - Handheld (DVB-H) mobile TV service in the country. The agreement will enable Safaricom subscribers to watch DSTV's menu of TV programmes on certain Nokia mobile phones.

The challenge now is for broadcasters to generate adequate local content to run their stations 24 hours a day. This could not have come at a better time, when the Kenyan audience is warming up to local movies, soap operas, music, documentaries and so on. Local artists will benefit from broadcasters’ demand for their talent in developing content. As of late 2009, 10 countries had completed the process of turning off analogue terrestrial broadcasting, and many others had plans to do so or were in the process of a staged conversion. The first country to make a wholesale switch to digital over-the-air (terrestrial) broadcasting was Luxembourg, in 2006, followed by the Netherlands later in 2006; Finland, Andorra, Sweden, Norway and Switzerland in 2007; Belgium (Flanders) and Germany in 2008; and the United States, Denmark, South Africa and Kenya in 2009.

Tuesday, November 24, 2009

Google Takes Cloud Computing To The Next Level

Cloud computing is a new concept that is quickly gaining popularity around the world, though it is rarely utilized in Kenya. But what is cloud computing? It is Internet-based ("cloud"-based) development and use of computer technology ("computing"). In concept, it is a paradigm shift whereby details are abstracted from the users, who no longer need knowledge of, expertise in, or control over the technology infrastructure "in the cloud" that supports them. It typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.

The main advantage of this concept is that users don’t need in-depth knowledge of the underlying computing concepts. In a semi-computer-literate country like Kenya, this would go a long way in enhancing computer usage by the ‘common mwananchi’. The other advantage is that the user does not require high-end resources on his or her computer; instead, the resources in the ‘cloud’ are shared among the users. The storage resources of Google Docs, for instance, are virtually inexhaustible. Finally, the user can access the storage servers from any part of the world via the Web; talk about unlimited portability.

Google Inc has taken cloud computing to the next level by providing an operating system built exclusively around it: Google Chrome OS. Google Chrome OS is an open source operating system designed by Google to work exclusively with web applications. It would require less storage space because the operating system on disk is reported to be about 60 times smaller than Windows 7.

"With Chrome, Google is seeking to challenge Microsoft dominance in the market It's basically a web browsing machine," said Altimeter Group analyst Charlene Li, referring to the netbooks powered by Chrome operating system (Source: Nation Daily 24th November, 2009)

The new Google Inc software will start up a computer as fast as a television can be turned on, an average of seven seconds on a netbook. The operating system is based on Linux and targets specifically designed hardware. The user interface takes a minimalist approach, resembling that of the Chrome web browser. Because the browser will be the only application residing on the device, Google Chrome OS is aimed at users who spend most of their computer time on the Internet.

The technology is apt for small- and medium-scale businesses with little money to invest in ICT infrastructure. With a low-end computer such as a netbook, a business can process and store business documents either in the office or remotely. This also presents an opportunity for our secondary and primary schools with low budgets for their computer labs: the computers would not need to run applications locally or have huge storage space. However, it demands an investment in Internet bandwidth, which is becoming cheaper by the day.

Friday, October 9, 2009

Mobile Telephony – GSM vs CDMA

Back in the 1990s, everyone in Kenya had to use a landline telephone, fax, telegram or postal letter to communicate. It required several days or weeks of planning just to organize a meeting in Nairobi with a person from upcountry. Tracing a person was not easy, unlike now, when a person can be ‘seen’ anytime, anywhere, using the mobile phone. These features of legacy communication methods adversely affected business operations: procurement processes took too long, landline access was limited to urban areas, collaboration between different branches of an organization remained challenging, the response rate to customers was slow, advertising was not personalized, and so on. The need for mobility was evident, and by the year 2000 mobile telephony providers had landed in Kenya. The rest of the world was rapidly adopting the technology, and by 2008 there were 4.1 billion mobile cellular subscriptions worldwide.

To get an insight into how mobile telephony works: mobile phones send and receive radio signals to and from cell-site base stations fitted with microwave antennas. These sites are usually mounted on a tower, pole or building, located throughout populated areas, and connected to a cabled communication network and switching system. The phones have a low-power transceiver that transmits voice and data to the nearest cell site. The most sophisticated aspect of mobile telephony is that a phone can switch seamlessly between sites: as the user moves around the network, "handoffs" are performed to allow the device to switch sites without interrupting the call.

There are two main standards used in mobile telephony: Global System for Mobile communications (GSM) and Code Division Multiple Access (CDMA). The two standards use different approaches to achieve the same goal: dividing the finite radio frequency spectrum among multiple users. As an analogy, think of a cocktail party where different pairs of people want to talk to each other. The first approach is to allocate each person a time slot in which to address the other; this is how GSM works. The second is to have each pair talk in a different language; this is how CDMA works. The first option may necessitate shorter speeches to give each person a chance to talk, while the second may prove noisy if the pairs are too loud. Technically, GSM requires high carrier frequencies, up to 1.9 GHz, to offer a high number of time slots, while CDMA may suffer from interference if high-amplitude codes are used on the same channel.

GSM is more popular than CDMA, with the technology adopted in 212 countries throughout the world. It rides on Time Division Multiple Access (TDMA), which allows eight full-rate or sixteen half-rate speech channels per radio frequency channel. Newer versions of the standard are backward-compatible with the original GSM phones. For example, Release '97 of the standard added packet data capabilities by means of General Packet Radio Service (GPRS), and Release '99 introduced higher-speed data transmission using Enhanced Data Rates for GSM Evolution (EDGE). Recent third-generation (3G) releases have improved the data rates available on ‘smart’ mobile phones. The main advantage of GSM is that it offers international roaming, essential for travelling business people. Overall, this means low business operation costs and high efficiency.
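To make the time-slot idea concrete, here is a minimal Python sketch (it imitates only the slot discipline, not real GSM framing or signalling; the eight slots simply mirror the eight full-rate channels mentioned above, and the user names are invented for the example):

```python
# Minimal sketch of the TDMA idea behind GSM: one radio frequency
# channel is divided into a repeating frame of 8 time slots, and each
# user may transmit only during the slot assigned to them.

FRAME_SLOTS = 8  # eight full-rate speech channels per RF channel

users = ["user_%d" % i for i in range(FRAME_SLOTS)]
slot_of = {user: slot for slot, user in enumerate(users)}  # static slot assignment

def transmit(frame_number, user, payload):
    """Return (absolute_slot, payload) for this user's turn in the given frame."""
    slot = slot_of[user]
    return (frame_number * FRAME_SLOTS + slot, payload)

# Each user sends one burst per frame; bursts never collide because
# every user owns a distinct slot within the frame.
for frame in range(2):
    for user in users:
        when, burst = transmit(frame, user, "%s speech burst" % user)
        print("slot %2d -> %s" % (when, burst))
```

Because every user is confined to a recurring slot, adding more users per carrier means shorter or less frequent bursts for each, which is the trade-off the cocktail-party analogy hints at.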

On the other hand, CDMA employs spread-spectrum technology and a special coding scheme (where each transmitter is assigned a code) to allow multiple users to be multiplexed over the same physical channel. Many codes occupy the same channel, but only users associated with a particular code can understand each other. CDMA has several unique features that make it a cost-effective, high-quality wireless solution. Each base transceiver station (BTS) in a CDMA network can use all available frequencies; adjacent cells can transmit at the same frequency because users are separated by code channels, not frequency channels. This feature of CDMA, called "frequency reuse of one", eliminates the need for frequency planning. Generally, CDMA has better bandwidth utilization since the same channel can be used by several users, which reduces the cost of implementation for a service provider rolling out a CDMA network. Reduced costs for the service provider may ripple down into a reduced cost of service for the customer. In addition, CDMA's features result in coverage that is between 1.7 and 3 times that of TDMA: coding and interleaving provide the ability to cover a larger area for the same amount of available power used in other systems. Finally, coding provides security for the conversations in a channel, since each channel has its own code and an interfering device cannot decode it.
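The code-separation idea can also be sketched in a few lines (a toy example using 4-chip Walsh codes for three imaginary users; real CDMA systems use much longer codes and must also handle noise, timing and power control):

```python
# Toy CDMA sketch: all users transmit at the same time on the same
# channel, each spreading its bit with an orthogonal Walsh code. The
# receiver recovers a user's bit by correlating the combined signal
# with that user's code.

WALSH = {
    "alice": [ 1,  1,  1,  1],
    "bob":   [ 1, -1,  1, -1],
    "carol": [ 1,  1, -1, -1],
}

def spread(bit, code):
    """Map bit {0,1} to a symbol {-1,+1} and multiply by the code chips."""
    symbol = 1 if bit else -1
    return [symbol * chip for chip in code]

def despread(signal, code):
    """Correlate the received signal with a code; a positive sum means bit 1."""
    return 1 if sum(s * c for s, c in zip(signal, code)) > 0 else 0

# Three users transmit one bit each, simultaneously, on the same channel.
bits = {"alice": 1, "bob": 0, "carol": 1}
channel = [sum(chips) for chips in zip(*(spread(b, WALSH[u]) for u, b in bits.items()))]

# Because the codes are orthogonal, each bit is recovered despite the overlap.
for user in bits:
    print(user, despread(channel, WALSH[user]))  # alice 1, bob 0, carol 1
```

The orthogonal codes play the role of the 'different languages' in the analogy: everyone talks at once, yet each listener hears only the speaker whose code they share.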

GSM still stands as the more popular of the two, while CDMA remains largely unexploited. In Kenya, there are four GSM providers (Safaricom, Zain, Orange and YU) and three CDMA providers (Orange Fixed Plus, Flashcom and Popote Wireless), but the number of subscribers for the two standards differs significantly.

Wednesday, September 30, 2009

Video Conferencing and its History

Current globalization trends mean that today’s business, government, educational and medical organizations of all sizes need the ability to communicate across different countries or even continents. One of the ways in which these organizations communicate is by use of video conferencing technology. This means they need an intelligent network infrastructure and communications tools that support collaboration in real time, from anywhere in the world.

Video conferencing solutions effectively eliminate the barriers of time, distance and resources, permitting people around the world to function as if they were in the same room. Companies can integrate telecommuters and arrive at decisions faster. Educational institutions can interactively disseminate knowledge anywhere, creating a true “campus without walls”, or give students the opportunity to learn by participating on a two-way communication platform. Furthermore, teachers and lecturers from all over the world can be brought to classes in remote or otherwise isolated places, and students from diverse communities and backgrounds can come together to learn about one another. Doctors can consult specialists from any part of the world to provide the best care for their patients at a reasonable cost.

Organizations that “humanize” their communications in this way can reduce administrative costs and increase productivity, profitability and competitiveness as never before. Kenyan companies have not been left behind in taking up video conferencing to reduce their operating costs. A recent upsurge in high-bandwidth provision by service providers has been fuelling companies’ appetite for a collaborative, interactive, cost-effective and, frankly, fascinating technology like video conferencing. During the recent African Growth and Opportunity Act (AGOA) conference at the Kenyatta International Conference Centre (KICC) in Nairobi, US President Barack Obama addressed the delegates via video conferencing. To understand this technology, let’s look at its history.

Video conferencing has undergone a long evolution to become what it is today. The dream of transmitting audio and video over the telephony network became a reality when digital telephony was introduced in the 1980s. Integrated Services Digital Network (ISDN) lines offer data transmission over the normal telephony network and can carry voice and video under a standard known as H.320. ISDN is available in most places where telephony services exist and provides dedicated links for video and others for data; these are its major advantages, since an organization does not need to set up a Wide Area Network or provide quality of service (QoS) by provisioning bandwidth for different classes of traffic. However, ISDN has its limitations: it does not support endpoint monitoring features or network monitoring systems, bandwidth cannot be shared among endpoints (additional endpoints require additional ISDN lines), and it lacks redundancy.

In the 1990s, Internet Protocol (IP) based video conferencing was introduced to address the limitations of video conferencing over ISDN. Signalling standards such as H.323 and SIP, together with efficient video compression technologies, were developed to permit desktop or personal computer (PC)-based video conferencing. Video conferencing was now possible in a web browser (web conferencing), leading to new players in the market that could offer the service: Meeting, MSN Messenger, Yahoo Messenger, SightSpeed and Skype. Web conferencing offered low-quality video and was suitable for home users rather than corporate organizations, governments and research institutes.

Converged IP solutions came along in the late 1990s and early 2000s to address the needs of the corporate market. Voice, video, data and fax traffic were converged and could all be transmitted over a single digital link. The convergence revolution was accompanied by sophisticated applications and devices, which demanded huge amounts of bandwidth. Applications like Cisco WebEx offer web conferencing, teleconferencing, chat, file sharing and user presence. New networking technologies were developed to carry the required bandwidth: Plesiochronous Digital Hierarchy (PDH), whose highest rate was 565 Megabits per second, was upgraded to Synchronous Digital Hierarchy (SDH), which supports even higher rates. SDH transmission is mostly done on optical fiber connections (OFC).

Today, video conferencing is widely available on the web or from service providers, and solutions can be high-end or low-end. For corporates, high-end solutions like Cisco Unified Communications are implemented; with bandwidths of 4 Megabits per second and more, telepresence systems can be deployed, in which case the experience is more immersive and close to reality. For small home users, low-end solutions such as the D-Link DVC-1000 i2eye VideoPhone can be implemented. Simply connect a standard telephone and a television to the DVC-1000, plug in a standard Ethernet cable connected to your broadband Internet connection, and you are ready to conduct real-time video conferencing.

Friday, August 28, 2009

NComputing

The recent global economic recession has compelled companies to cut down on their operating costs. Most multinational companies, such as General Motors (GM), Toyota and Volkswagen, had to close some of their plants or lay off a number of workers, while small and medium enterprises had to pull back on their expansion or modernisation plans. However, the use of computers in the daily operations of a business could not be sidelined, since it has become such an integral part of business activities. Be it communication amongst staff, control of huge machinery in a factory, monitoring a plant using security cameras, advertising on the Internet, keeping financial records or researching new products in the market, computers have become indispensable in business operations.

One way of reducing the cost of business computing in these hard economic times is a recent technology known as NComputing. Today’s computers are so powerful that the vast majority of applications use only a small fraction of the computer’s capacity; modern desktop computers sit idle while we check our e-mail, surf the web or type a document. NComputing technology taps this unused capacity so that it can be shared simultaneously by many users. Each user connects to the shared PC through an access device. The access device has no CPU, memory or moving parts, so it is rugged, durable, and easy to deploy and maintain.

An example of an NComputing-style product is the Userful Multiplier, which takes one ordinary desktop computer and turns it into ten. Just install the Linux-based software, add monitors, keyboards and mice, and you can support up to 10 independent users at the same time on a single computer. Userful Multiplier is a good solution for many industries that need to cut down on operating costs. Another example is the L-series from NComputing Ltd, which is ideal if your users need to be more than 10 metres away from the shared computer. The L-series connects across a standard Ethernet local-area network, that is, via a switch or router; users just snap into place and can be just about as far away from the shared computer as you like. As illustrated by these two examples, there are many products in the market that would suit your business needs.

This technology also has other areas of application. In education, for instance, it can be used to lower the cost of setting up a computer laboratory. The Digital Solidarity Fund (DSF) from Switzerland has been using this technology to broaden the scope of computer access in developing countries. “NComputing offers huge potential to expand the reach of computer access in developing nations,” says Mehdy Davary, director of DSF. “Even refurbished computers can become expensive, not to mention the costs of keeping them running. NComputing access devices require almost no maintenance and that’s a huge plus.”

Wednesday, July 1, 2009

Role of Closed Source Software

Closed Source Software (CSS) has been the benchmark of software development for many years. Microsoft products, for instance, have long been used in training institutions to teach computer literacy programmes. In recent years, the advent of Open Source Software (OSS) has changed that position, especially in the emerging economies. OSS provides a cheaper means to an end: many servers run on Linux and UNIX operating systems that require no licenses, or less expensive ones.

However, CSS still has a major role to play in the ICT world. In fact, CSS and OSS can co-exist harmoniously and produce superior products. One good example is the Cisco Unified Call Manager, which runs on a Linux box. This makes the solution cheaper and thus more affordable, even though one still has to pay for the Cisco software. Had Cisco required a proprietary operating system and server platform to run the software, the cost would have been beyond reach. Just to paint the picture, a single Unified Communications site requires seven call manager servers, so an increase in server cost of, say, $200 leads to a $1,400 increase per site, i.e. about Shs 112,000.
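A quick back-of-the-envelope check of those figures (a sketch only; the $200 per-server increment is the article's own hypothetical, and the exchange rate of roughly Shs 80 to the dollar is an assumption consistent with the figures quoted):

```python
# Back-of-the-envelope cost check for the seven-server example above.
servers_per_site = 7
extra_cost_per_server_usd = 200   # hypothetical per-server price increase
kes_per_usd = 80                  # approximate 2009 exchange rate (assumption)

extra_per_site_usd = servers_per_site * extra_cost_per_server_usd
extra_per_site_kes = extra_per_site_usd * kes_per_usd

print(extra_per_site_usd)  # 1400
print(extra_per_site_kes)  # 112000
```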

Usually, CSS vendors have the resources to vet and troubleshoot projects that can eventually be packaged into products for the market. Some of these projects start as OSS projects, and sometimes companies are formed around them. OSS, in turn, encourages innovation, as developers come together to exchange ideas.

Monday, June 8, 2009

The Internet

The Internet is probably one of humanity's most remarkable innovations, yet most people still don't know what it is or how it works. Technically, it's just the transmission of data over a network of encoded channels, from a digital source to a digital sink. A lot of scientific research has gone into ensuring that the message sent is received as intended.

Now, let's understand how it works. The device most synonymous with the Internet is the computer; it is often both the source and the destination. Digital systems like computers work on a different number system from humans: for digital systems it is known as the binary system, while humans use the decimal system. The decimal number system has ten symbols: 0, 1, 2, 3, 4, 5, 6, 7, 8 and 9. The binary number system, on the other hand, has just two symbols, 0 and 1, called bits. Simple, right? These two symbols can be used to represent anything, and I mean anything: numbers, letters, words, images, videos, symbols, voice and so on.

To understand how this happens, let's start with an illustration. Using two binary bits, we can represent the decimal numbers 0, 1, 2 and 3 as follows: 00, 01, 10 and 11. The rightmost bit is called the Least Significant Bit (LSB) while the leftmost bit is called the Most Significant Bit (MSB). Moving from the LSB towards the MSB, each position corresponds to an increase of the power of 2 by one. Similarly, in the decimal number system, each position means an increase of the power of 10 by one; for instance, moving from 10 to 100 means an increase from 10^1 to 10^2. You follow?
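Here is a small Python sketch of that positional idea, using nothing beyond what the paragraph above describes:

```python
# Each bit position, counted from the LSB, is worth one higher power
# of 2, just as each decimal digit position is worth one higher power of 10.

def binary_to_decimal(bits):
    """Convert a string of bits (MSB first) to its decimal value."""
    value = 0
    for position, bit in enumerate(reversed(bits)):  # position 0 is the LSB
        value += int(bit) * (2 ** position)
    return value

# The four two-bit patterns from the illustration above.
for pattern in ["00", "01", "10", "11"]:
    print(pattern, "=", binary_to_decimal(pattern))  # 0, 1, 2 and 3
```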

Take an example: John in Texas, USA wants to send the number 2 to Otieno in Kisumu. When he presses 2 on the keyboard, the computer encodes it into binary bits, say 10 (or another code depending on the standard in use). The bits are processed and encapsulated with the Internet Protocol (IP) address of John's Internet provider's mail server, say Verizon's (a mail server is a high-capacity computer dedicated to the storage and processing of e-mail). That mail server reads the destination e-mail address and sends the message to a mail server in Kenya, say Wananchi Online's, via satellite or fiber optic (undersea) cable. The Kenyan mail server knows Otieno's e-mail address and puts the message in his inbox. When Otieno opens his inbox, his computer decodes the binary bits back into the character 2 for display on the monitor.
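As a toy illustration of that journey (everything here is made up for the example: the addresses, server names and two-hop route are stand-ins, not how real mail routing works in detail):

```python
# Toy end-to-end flow: encode a character, wrap it with addressing
# information, relay it through two mail servers, then decode it.

def encode(char):
    """Turn a character into an 8-bit binary string (its ASCII code)."""
    return format(ord(char), "08b")

def decode(bits):
    """Turn an 8-bit binary string back into a character."""
    return chr(int(bits, 2))

# Encapsulate the payload with source and destination addresses (hypothetical).
message = {
    "from": "john@example.com",
    "to": "otieno@example.co.ke",
    "payload": encode("2"),
}

# Each relay simply looks at the destination and passes the message on.
for hop in ["verizon-mail-server", "wananchi-mail-server"]:
    print(hop, "forwarding message for", message["to"])

# At the destination, the bits are decoded back for display.
print("Otieno sees:", decode(message["payload"]))
```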

In practice, the American Standard Code for Information Interchange (ASCII) is used to represent textual information for transmission. The ASCII character table provides the translations between characters, decimal values and binary patterns. For example, the decimal value 10 is written as 00001010 in binary, and the character C has the ASCII code 67, which is 01000011 in binary. Representing images, voice and video is a bit more complicated; sampling and quantization techniques are used for those.
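A minimal Python sketch of those ASCII conversions (the sample text is just an illustration; Python's ord and chr functions return the ASCII values for these characters):

```python
# Convert characters to their ASCII codes and 8-bit binary patterns, and back.

def to_ascii_bits(text):
    """Return one 8-bit binary string per character, using ASCII codes."""
    return [format(ord(ch), "08b") for ch in text]

def from_ascii_bits(bit_strings):
    """Reverse the encoding: 8-bit binary strings back to characters."""
    return "".join(chr(int(bits, 2)) for bits in bit_strings)

print(ord("C"), format(ord("C"), "08b"))  # 67 01000011, as in the example above

encoded = to_ascii_bits("Hi Otieno")
print(encoded)
print(from_ascii_bits(encoded))           # "Hi Otieno"
```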

Whatever the type of information being sent, the basics of transmission are the same. The differences arise where information requires encryption, compression or error correction. For instance, if a message is a secret password it is encrypted so that no one else can read it; if it is a video it is compressed to reduce bandwidth requirements. If it is a synchronization message, error-detection flags are incorporated; if an error is detected, error correction is carried out or retransmission of the message is requested.
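As a small illustration of the error-detection idea, here is a single-parity-bit sketch in Python (far simpler than the checksums and coding schemes used on real links, but it shows the principle):

```python
# Single parity bit: the sender appends one bit so the frame has an even
# number of 1s; the receiver flags any frame whose count of 1s is odd.

def add_parity(bits):
    """Append an even-parity bit to a string of 0s and 1s."""
    parity = str(bits.count("1") % 2)
    return bits + parity

def check_parity(frame):
    """Return True if the frame still contains an even number of 1s."""
    return frame.count("1") % 2 == 0

frame = add_parity("00110010")            # the ASCII bits for '2' plus a parity bit
print(frame, check_parity(frame))         # an undamaged frame passes the check

corrupted = "1" + frame[1:]               # simulate a single-bit error in transit
print(corrupted, check_parity(corrupted)) # the check fails, so retransmission would be requested
```

A single parity bit can only detect an odd number of flipped bits; real systems use stronger codes that can also locate and correct errors, which is the error correction the paragraph above mentions.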