How Does a Television Work

A television, or TV, is an electronic system that transforms visual images with sound into electrical signals and displays them on a screen. TVs are vital forms of communication that provide news and entertainment to people all around the world.

How Do Televisions Work?

Old or new, most televisions work on the same principle. The tiny dots of light on the TV screen, called pixels, light up according to a pattern carried by the video signal. The TV displays many of these tiny dots and, when viewed as a whole, they appear as a picture.

The viewer's eyes see the image, and the brain interprets the pattern of dots as a recognizable picture. The TV refreshes these patterns faster than the human eye can perceive, which creates the illusion of a moving picture, or video. Meanwhile, a microphone at the camera converts sound into an electrical audio signal. Together, the video and audio signals constitute the TV signal.
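The sketch below illustrates this idea in Python: a screen is treated as a grid of pixels, and redrawing that grid many times per second produces apparent motion. The grid size, the moving-dot pattern, and the 60 Hz refresh rate are illustrative assumptions, not parameters of any real TV.

```python
import time

WIDTH, HEIGHT = 8, 3        # a tiny toy "screen" of pixels
REFRESH_HZ = 60             # how many times per second the grid is redrawn

def frame(t):
    # One frame: a bright dot ("#") moves one pixel to the right each refresh.
    return [["#" if x == t % WIDTH else "." for x in range(WIDTH)]
            for _ in range(HEIGHT)]

for t in range(5):          # show a few refresh cycles
    for row in frame(t):
        print("".join(row))
    print()
    time.sleep(1 / REFRESH_HZ)   # wait until the next refresh
```

Redrawn fast enough, the dot appears to glide across the screen rather than jump, which is the same illusion a TV relies on.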

How Does a TV Signal Reach the TV Set?

A TV signal can reach the TV set through several media, such as antennas, cables, and satellites. Television stations send out the shows that viewers watch as electromagnetic signals. Transmitters are placed at high points so that they can convert these signals into radio waves and broadcast them effectively.

A television antenna is an aerial that picks up over-the-air electromagnetic signals from television stations and delivers them to the TV set as channels. Satellite televisions use signals received by satellite dishes, usually mounted on the roof. In the case of cable TV, the signals are transmitted through a coaxial or fiber optic cable.

There is less compression with over-the-air signals, so antennas provide a clearer HDTV picture than satellite or cable. Conversely, satellite and cable providers compress the signal in order to fit more channels into the bandwidth available to viewers.

Please look for Amita Vadlamudi’s other articles, pictures, and videos on the following web sites:

Amita Vadlamudi on Vimeo

Amita Vadlamudi on Scribd

Amita Vadlamudi on Weebly

Personal Computers


A personal computer, or PC, is a general-purpose digital computer designed for a single end user. A typical personal computer consists of one CPU (central processing unit) that handles all the device's computing tasks. It also contains internal memory, storage, and input and output ports, and it connects to external devices such as a monitor, keyboard, mouse, printer, and network. Its operations are controlled by software called the operating system.

Business Computers vs. Personal Computers

Business-grade computers, which are massive in physical size and computing power, are used by large and small businesses, governments, and scientific and educational institutions. Personal computers, on the other hand, are made for individual use. They are lightweight, portable, and affordable for individuals.

Timeline of Computer Evolution

Although no exact date of its inception is known, it is generally accepted that the first computer was invented by Charles Babbage in 1822. That computer was, however, far from what a typical computer looks like today. Over the years, the computer has evolved dramatically. Let us examine this evolution.

First Generation

The early computers were created for business or scientific use. The first generation of computers ran on vacuum tubes and used magnetic drums for memory. This is the period when the ENIAC (Electronic Numerical Integrator and Computer) came into being. The ENIAC covered 1,800 square feet and weighed 30 tons. There were several issues with this first round of computers, paving the way for improvement.

Second Generation

The second generation of computers replaced vacuum tubes with transistors, which were a more reliable mechanism. This helped spur the growth of smaller and more manageable computers, closer to the ones we see today.

Third Generation

The reign of the transistor was itself short-lived, however, as integrated circuits became popular. Integrated circuits significantly improved computers' speed and efficiency, making them more affordable and accessible to a much larger number of people.

Fourth Generation

The fourth generation, which is the present age of computers, saw the invention of the microprocessor. Microprocessors made it possible to produce personal computers on a mass scale. A microprocessor packs thousands of integrated circuits onto a single silicon chip, combining the CPU, input and output controls, and memory.

Today’s Personal Computers

The evolution of computers shows how accessibility and affordability created the personal computer, now used by a single user for work and entertainment. Personal computers were truly born in 1977, with the launch of three mass-produced machines: the Apple II from Apple Inc., the Personal Electronic Transactor (PET) from Commodore International, and the TRS-80 from Tandy Radio Shack, all of which used microprocessors. Personal computers became affordable and popular during the 1980s.

Conclusion

Personal computers changed the way people live and interact with one another today. They enable a user to engage in numerous activities, including work, internet browsing, social networking, gaming, multimedia streaming, and online shopping, to name just a few. The evolution of personal computers has been rapid and steady. It is hard to imagine life without a personal computer today.

About the Author: Amita Vadlamudi had a career of over 35 years in the computer industry. After retiring from the field, she now spends her time reading and blogging on various topics, including computers.

5G Cell Phone Technology


5G is short for "5th Generation Mobile Network," and it is the next installment in global wireless standards after the 1G, 2G, 3G, and 4G technologies.

5G cell phone technology enables a cutting-edge network that can connect virtually everyone with everything, including devices, objects, and machines. 5G was already being used in different parts of the world during early 2019. In 2020, many other countries are expected to provide 5G access to their populations.

How Does 5G Work?

5G utilizes OFDM (orthogonal frequency-division multiplexing), a method of modulating a digital signal across many narrow sub-channels in order to reduce interference. In addition, 5G makes use of wider-bandwidth technologies such as mmWave and sub-6 GHz spectrum.

Much like 4G, 5G's OFDM technology operates on the same basic mobile networking principles. Still, the 5G NR air interface enhances OFDM to provide higher degrees of scalability and flexibility. In simpler terms, 5G can provide access to more people and objects across a large variety of use cases.
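As a rough illustration of the OFDM idea described above, the Python/NumPy sketch below maps bits onto many parallel subcarriers and combines them into one time-domain symbol with an inverse FFT. The numbers used here (64 subcarriers, QPSK, a 16-sample cyclic prefix) are illustrative assumptions, not the actual 5G NR numerology.

```python
import numpy as np

num_subcarriers = 64      # parallel narrow sub-channels
cp_len = 16               # cyclic prefix length (guards against multipath echoes)

# Random bits -> QPSK symbols, one complex symbol per subcarrier
bits = np.random.randint(0, 2, 2 * num_subcarriers)
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

# The IFFT turns the frequency-domain subcarrier symbols into one
# time-domain OFDM symbol; the subcarriers remain orthogonal.
time_domain = np.fft.ifft(symbols)

# Prepend the cyclic prefix (a copy of the tail) before transmission
ofdm_symbol = np.concatenate([time_domain[-cp_len:], time_domain])
print(len(ofdm_symbol))   # 80 complex samples ready for the radio front end
```

Splitting the data across many slow, parallel subcarriers like this is what makes OFDM robust against interference between symbols.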

How Is It Different From Previous Network Technologies?

Even though 1G, 2G, 3G, and 4G inspired the creation of 5G, the latter is able to provide more connectivity than network users have ever experienced before.

5G not only has a more efficient air interface, but it is more unified and has an extended capacity to allow for next-gen user experiences. Additionally, 5G will also be able to deliver new deployment models and empower new services with negligible latency, superior reliability and high speed connectivity.

5G Applications

5G was specifically designed to not only deliver better and faster mobile broadband services as compared to 4G LTE, but also to broaden its reach into new areas of service such as connecting to the IoT or mission-critical communications. It is safe to say that 5G will make an impact on every industry, including digitized logistics, precision agriculture, remote healthcare, and many others.

Wi-Fi: Exploring the Digital Era

The onset of the digital era has brought about a myriad of inventions, and one of the most commonly used is Wi-Fi. While millions of people use Wi-Fi every day, few know exactly what it is and how it works. Upending traditional networking methods, Wi-Fi carries a large share of the world's internet traffic, by some estimates around 60%.

What is Wi-Fi?

Wi-Fi is commonly said to stand for "wireless fidelity." As the name suggests, it provides a wireless internet connection that uses radio frequencies to transmit signals between devices. Also known as wireless local area network (WLAN) technology, Wi-Fi is a simpler term for its technical name, IEEE 802.11.

How does it Work?

The main purpose of a Wi-Fi network is to provide connectivity to all devices connected to it. As mentioned above, Wi-Fi uses radio waves to operate. Radio waves are a type of electromagnetic radiation. Wi-Fi transmits and receives these radio waves in the Gigahertz range.

Wi-Fi operates in the 2.4 GHz and 5 GHz frequency bands. There are a whole lot of invisible radio waves working to allow you to stream your favorite show with ease.
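To make those frequency figures a little more concrete, the short Python sketch below computes the center frequencies of the familiar 2.4 GHz Wi-Fi channels. The function name is made up for illustration; the 2407 + 5 × n MHz relationship applies to channels 1 through 13, with channel 14 (2484 MHz) being a regional special case.

```python
def channel_center_mhz(channel: int) -> int:
    """Center frequency (MHz) of a 2.4 GHz Wi-Fi channel."""
    if channel == 14:
        return 2484                      # special case, used only in some regions
    if 1 <= channel <= 13:
        return 2407 + 5 * channel        # channels are spaced 5 MHz apart
    raise ValueError("not a 2.4 GHz Wi-Fi channel")

for ch in (1, 6, 11):                    # the three non-overlapping channels
    print(f"channel {ch}: {channel_center_mhz(ch)} MHz")
# channel 1: 2412 MHz, channel 6: 2437 MHz, channel 11: 2462 MHz
```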

Benefits of Wi-Fi

Wireless networking is a burgeoning trend these days, owing to its convenience. One of the most obvious benefits of Wi-Fi is its wireless quality, rendering it a mobile way to stay connected, anywhere you may be.

Wi-Fi connections are affordable and easy to set up, which explains why most households today have a working Wi-Fi connection.

Wi-Fi is a popular choice because it offers security. To keep strangers from using your personal connection, you can protect the network with a password using WPA2 (Wi-Fi Protected Access). If you want to allow someone to connect through your device, you can open a "hotspot."

Among the innumerable technological advances, Wi-Fi is undoubtedly one of the most useful inventions, allowing seamless connectivity.

Look for other articles by Amita Vadlamudi on this website, as well as the following:

Amitavadlamudi.net

Amitavadlamudi.org

What is Bluetooth, and how does it work?


Bluetooth is a type of short-range wireless communications technology. It was developed mainly to replace the cables that connect different devices for transferring data. Bluetooth establishes a high-speed, low-power wireless link between devices.

The wireless link is created through low-power radio waves. Wireless signals transmitted using Bluetooth usually cover a short distance, generally up to 30 feet. The wireless connection between Bluetooth-compatible devices is achieved through low-cost, low-power transceivers integrated into the devices.

Bluetooth uses the 2.45 GHz frequency band to establish a connection between the transceiver devices. The technology can support a data transfer rate of up to 721 Kbps between wirelessly connected devices, as well as three voice channels. The use of the 2.45 GHz frequency band for Bluetooth-compatible devices is standard practice throughout the world.

An international agreement set aside this industrial, scientific, and medical (ISM) frequency band, establishing a worldwide standard for Bluetooth technology. Bluetooth can be used to connect up to eight devices at the same time, depending on the capabilities of the devices connected.

Each of the connected devices has a unique 48-bit address based on the IEEE 802 standard. The connections between devices using Bluetooth can be either point-to-point or multipoint.

Bluetooth technology is understood to have emerged from a project undertaken by the telecommunications company Ericsson Mobile Communications back in 1994, when the company set out to find a wireless alternative to the cables connecting mobile phones to other devices.

In 1998, Ericsson established the Bluetooth Special Interest Group in combination with Toshiba, Nokia, and IBM. The collective, known as SIG, released the first version of this technology in 1999.

The technology was named after the Danish Viking king Harald Blatand, whose last name translates to "Bluetooth" in English. The monarch is credited with uniting Denmark and Norway, much as Bluetooth technology connects devices.

A Bluetooth network consists of a minimum of two and a maximum of eight devices. A master device is responsible for initiating communication with other devices. Typically, a master device can connect to up to seven slave devices. The master device governs the interaction between different devices, establishing the communication link and traffic between itself and the slave devices it is connected to.
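The sketch below models the master/slave arrangement just described as a simple Python data structure: one master device, up to seven active slave devices, and 48-bit device addresses. The class, method names, and example addresses are purely illustrative and are not part of any real Bluetooth software stack.

```python
MAX_ACTIVE_SLAVES = 7          # a master governs at most seven active slaves

class Piconet:
    """A toy model of one Bluetooth network (a 'piconet')."""

    def __init__(self, master_addr: int):
        assert 0 <= master_addr < 2**48, "device addresses are 48 bits"
        self.master = master_addr
        self.slaves = []

    def connect(self, slave_addr: int):
        # The master initiates and governs every link in the piconet.
        assert 0 <= slave_addr < 2**48, "device addresses are 48 bits"
        if len(self.slaves) >= MAX_ACTIVE_SLAVES:
            raise RuntimeError("piconet already has seven active slaves")
        self.slaves.append(slave_addr)

net = Piconet(master_addr=0x001A7DDA7113)   # hypothetical phone address
net.connect(0x9C8E99F00001)                 # hypothetical headset address
print(len(net.slaves) + 1, "devices in the piconet")
```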

Typical uses of Bluetooth technology include, but are not limited to, the transfer of multimedia between devices and the automatic synchronization of data between devices. The wireless connectivity made possible by Bluetooth has led to a myriad of device-specific applications across various types of technological devices.

 

Amita Vadlamudi has published many other articles at some of her other web sites, including the following:

Amitavadlamudi.org

Aboutamitavadlamudi.org

Cell Phone Technologies

 


What exactly are GSM, CDMA, and LTE? Most people understand that these terms refer to cellular network technologies, but to many they are little more than buzzwords.

GSM and CDMA are two different technologies that were developed in the early '90s for 2G networks, along with the second generation of cell phones. Because engineers could not agree on which one was better, the Federal Communications Commission decided to allow both types of networks to operate. This means US consumers need to choose between CDMA and GSM when they sign up for service. The rest of the world, however, settled simply on GSM.

With improvements in technology in the 3G era, both CDMA and GSM improved, and the FCC chose to retain the dual approach. With the arrival of the 4G era, a new system called the LTE network was created.

Most of the new phones on the market use LTE for data, while GSM and CDMA are still used for voice calls and texting. This means US consumers still need to decide between the two cellular technologies. But with the advent of 4G and 5G technologies, voice and text capabilities are being added to LTE, which will soon make GSM and CDMA obsolete.

The primary reason GSM is preferred over CDMA around the world is that GSM phones use removable SIM cards. SIM cards carry the phone number and the service plan from one phone to another.

CDMA phones have embedded serial numbers that must be registered with the carrier. This ties the phone to the carrier and makes switching between carriers difficult.

LTE technology also uses SIM cards, like GSM, and is expected to replace both GSM and CDMA in the near future. It will therefore soon be easier for US consumers to switch between carriers while keeping the same physical phone.

 

About the Author:

Amita Vadlamudi has worked in the information technology field for over three decades, supporting IBM mainframes and Unix computer systems. Amita Vadlamudi's current interests are researching and writing about various technologies.

 

 

IBM Mainframes


IBM mainframes are large computers that have been produced by the technology giant IBM since 1952. Its mainframe computers saw IBM dominate the industry throughout the '60s and '70s, and IBM is still the major player in the industry today.

IBM mainframes have always been, and still are, huge in terms of hardware size. Size has been one of their most distinguishing characteristics, with a single unit being larger than a refrigerator.

Computers commonly used in the modern era are far more compact than IBM mainframes, but they have not made the mainframes obsolete. One might think that such bulky hardware no longer has any use, but that is far from the truth. In a time when data processing has become much faster and more efficient than it used to be, IBM has maintained an edge over other manufacturers in producing ultra-high-performance data processing machines.

IBM mainframes play a central role in the day-to-day operations of most of the largest corporations in the world. While other forms of computing are still used extensively in business operations, IBM mainframes have their own place in the business environment, serving various purposes. Industries including healthcare, banking, and finance, and even governments, consider IBM mainframes one of their foundations.

Until the 1990s, IBM mainframes were used mainly to process large amounts of data for businesses. With steady technological advancement, IBM mainframes are now also capable of running large and complex programs, such as general ledger processing and payroll generation software.

The primary operating systems used in the current IBM mainframes include z/OS, z/VM, z/VSE, z/TPF and Linux.

 

Look for other articles by Amita Vadlamudi on her various Websites, including the following:

aboutamitavadlamudi.org

Amita Vadlamudi on Wix

What is the Internet?


February 7, 1958 was a very important moment in internet history, as it marks the day when Secretary of Defense Neil McElroy signed the Department of Defense directive that created the Advanced Research Projects Agency (ARPA), now called DARPA, the Defense Advanced Research Projects Agency. The establishment of this agency led to the creation of the internet as we know it today.

During the Cold War, in 1957, the USSR (Soviet Union) launched Sputnik, a satellite. The United States was apprehensive about the USSR's rising power in science, and Americans were concerned that the Soviet Union had the ability to damage their long-distance communication networks in an attack. In 1962, J.C.R. Licklider, a notable ARPA scientist, came forward with a plan to keep the communication network alive during a nuclear attack by connecting computers to one another. This network was later named the ARPA Network, or ARPAnet. The method of packet switching enabled the transmission of data in 1965. In 1969, the contractor Bolt, Beranek and Newman (BBN) created networking devices known as interface message processors (IMPs), which fundamentally transformed the way data was transmitted.

The creation of ARPAnet gave way to the development of electronic mail. In 1971, Ray Tomlinson invented electronic mail. However, sending a message to a user on another network was a complex procedure, and the creator of electronic mail came up with the idea of using the "@" symbol to address it. By 1989, hypertext, the basis of the World Wide Web, had been invented. As the internet advanced, the technology went from being a vague research idea to an integral part of everybody's daily life.

Every computer on the internet has a unique address known as an IP address, which can be either temporary or permanent. A connection through a LAN (local area network) may provide a permanent IP address, while DHCP (Dynamic Host Configuration Protocol) assigns a temporary one. Whether temporary or permanent, the IP address is what allows information of all types to be transferred from one computer to another. A message sent to another computer travels over the network wiring; along the way it is translated from human-readable text into the computer's own language, and then translated back into alphabetic text at the other end. This is accomplished through the use of layered communication protocols known as protocol stacks.
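As a rough illustration of that journey, the Python sketch below resolves a host name to an IP address, lets the operating system's TCP/IP protocol stack carry a small text message to it, and decodes the reply back into text. The destination host and port here are assumptions chosen purely for illustration.

```python
import socket

host = "example.com"                  # hypothetical destination
addr = socket.gethostbyname(host)     # resolve the name to an IP address
print("destination IP address:", addr)

# The operating system's protocol stack (TCP/IP) handles delivery.
with socket.create_connection((host, 80)) as conn:
    request = f"HEAD / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))                        # text -> bytes on the wire
    reply = conn.recv(4096).decode("ascii", errors="replace")    # bytes -> text again
    print(reply.splitlines()[0])                                 # e.g. "HTTP/1.1 200 OK"
```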

The internet infrastructure is composed of a wide array of networks that interconnect with one another. These networks are known as Network Service Providers, or NSPs. Some commonly known NSPs include CerfNet, UUNet, IBM, SprintNet, and BBN Planet. These varying networks connect with each other at Network Access Points (NAPs) to maintain and advance the flow of traffic. Packet traffic is transferred from one NSP's network to another in order to reach its final destination. NSPs also interconnect at Metropolitan Area Exchanges, or MAEs. MAEs serve the same function as NAPs, except that they are privately owned. Both NAPs and MAEs are known as Internet Exchange Points, or IXs.

Today, the internet is used for a variety of purposes including online education, engaging games, research purposes, job-hunting, online shopping, and social networking.

 

Amita Vadlamudi worked in the information technology field for over 30 years and is well-versed in technology-related literature. More information about Amita Vadlamudi's professional credentials may be found on her About.me web site.