THE INFORMATION REVOLUTION
Digital India Project
The Digital India Project, a flagship scheme of the Modi government, aims to connect all gram panchayats by broadband internet, promote e-governance and transform India into a connected knowledge economy. Digital India is a massive technology push to provide electronic governance and universal phone connectivity across the country. The aim is to bridge India's digital divide and bring in large investments in technology manufacturing. The Digital India campaign is expected to see an investment of Rs 4.5 lakh crore and create jobs for at least 18 lakh people as the country moves from e-governance to m-governance (mobile governance). The government plan aims to stop net imports of technology and electronics by 2020, while creating over 100 million jobs.
The challenges are many. India's average internet speed ranked 115th globally earlier this year among countries studied by the content-delivery provider Akamai Technologies, and India had just a little over 100 million broadband subscribers at the end of April.
Key Initiatives:
Optical fibre
Fibre optics is the branch of science that deals with the transfer of information from one point to another using fibres. It uses light waves as the carrier waves for signals and is based on the principle of total internal reflection. An optical fibre is a thin, cylindrical cable made of glass fibre used for transmitting light signals. Light propagates down an optical fibre by refraction and internal reflection. The core has a high refractive index; the cladding has a lower refractive index than the core.
There are two types of fibres: one in which the central core has a uniform refractive index (step index), and another in which the refractive index changes gradually (graded index), decreasing outward from the centre of the fibre.
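To make the total-internal-reflection condition concrete, here is the standard relation with assumed example values for the two refractive indices (1.48 and 1.46 are typical of silica fibre, not figures from this article). Light striking the core-cladding boundary is trapped whenever its angle of incidence exceeds the critical angle:

$$\sin\theta_c = \frac{n_{\text{cladding}}}{n_{\text{core}}} = \frac{1.46}{1.48} \approx 0.986 \quad\Rightarrow\quad \theta_c \approx 80.6^\circ$$

Rays travelling at shallower angles than this leak out into the cladding; steeper rays reflect repeatedly and carry the signal down the fibre.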
Before they are sent through an optical fibre, electrical signals are converted into pulses of light by semiconductor devices known as light-emitting diodes (LEDs) and laser diodes.
At the far end, light is reconverted into electrical signals by a receiver which consists of a photodiode. Signals can be transmitted in both directions provided that both ends of the fibre have a source and a receiver.
The main advantages of optical fibres in the transmission of information are their larger channel capacity, lighter weight compared with copper cables, and the absence of cross-talk between adjacent cables. Besides, these cables do not pick up noise signals and can withstand extreme environmental conditions, so there is no interference from the outside environment. They are even cheaper than copper cables, and they have lower signal loss and a large signal-carrying capacity.
Communication technologies
Mobile satellite telecommunications is being pioneered by the International Maritime Satellite Organization (Inmarsat), the global mobile satellite communication organization. Inmarsat is an intergovernmental organization with 76 member nations, headquartered in London. It was set up in 1979 to serve the maritime community, but has since expanded into being the sole provider of worldwide mobile satellite communications for commercial, distress and safety applications at sea, on land and in the air. India is a founder member of Inmarsat. Videsh Sanchar Nigam is the signatory to Inmarsat on behalf of the Government of India. Inmarsat supports services including direct-dial telephone, telex, facsimile, electronic mail and data communications between mobile users and subscribers around the world connected to the international public switched network.
Key components of this system include:
Hand-held satellite telephone - Inmarsat-P: The advent of the hand-held satellite telephone Inmarsat-P will be the realization of one of mankind's ultimate dreams: the ability to communicate instantly and effortlessly to and from any place on Earth. A new generation of advanced satellites will be required to deliver Inmarsat-P services. The complex issues surrounding the choice of available satellite technology have led Inmarsat to focus on two of the most promising designs: intermediate circular orbit (ICO) and geostationary orbit (GSO). The choice of spacecraft constellation will ultimately determine the quality and price of any service. Mobile satellite communications is likely to be one of the most exciting areas to watch in the realm of communications.
GSM, which stands for Global System for Mobile communications, is the world's most widely used cell phone technology. A handset uses a carrier's GSM network by searching for cell towers in the nearby area.
The origins of GSM can be traced back to 1982 when the Groupe Spécial Mobile (GSM) was created by the European Conference of Postal and Telecommunications Administrations (CEPT) for the purpose of designing a pan-European mobile technology.
It is estimated that 80 per cent of the world uses GSM technology when placing wireless calls, according to the GSM Association (GSMA), which represents the interests of the worldwide mobile communications industry. This amounts to nearly 3 billion people globally.
AT A GLANCE
The first wireless mobile telecommunications technologies, which originated in the 1980s, are referred to as 1G. They were predominantly analog and provided only voice services.
2G refers to the second generation of mobile telecommunications technologies. The best-known 2G technology is GSM. The first CDMA standard (cdmaOne) was also a 2G technology. Being digital, 2G technologies not only provide voice services but can also provide data transfer through the Short Message Service (SMS). 2G technologies also provide greater security to users, as the communications are encrypted.
There are certain intermediate technologies between 2G and 3G that are referred to as 2.5G and 2.75G. The main intermediate technologies are GPRS and EDGE. Both provide high speed data transfer on existing 2G networks enabling Internet access on mobile phones.
GPRS: General Packet Radio Service (GPRS) is a mobile data service available to all users of GSM technology. GPRS provides data rates of 56-114 kbps. It enables use of the Internet on mobile phones, and the multimedia messaging service (MMS).
EDGE: Stands for 'Enhanced Data rates for GSM Evolution'; also known as Enhanced GPRS or EGPRS. It is a mobile data service available to all users of GSM technology and is considered a 3G technology. It is more sophisticated than the earlier GPRS and provides more than a three-fold increase in both the capacity and performance of GPRS. It can provide data rates of up to 384 kbps, allowing Internet access at higher speeds. The existing GSM infrastructure does not need to be changed.
3G STANDARDS: The two main 3G standards are:
UMTS: Universal Mobile Telecommunications System. An evolution of the 2G GSM standard, introduced in 2001. It is used primarily in Europe, Japan, China and other regions dominated by GSM 2G infrastructure. This is the technology currently used by BSNL and MTNL for their 3G services.
CDMA2000: Introduced in 2002. It is used mainly in North America, South Korea and other regions dominated by CDMA (cdmaOne) infrastructure.
FEATURES OF 3G:
Provides voice, SMS and data transfer at speeds much higher than GPRS and EDGE, enabling users to access the Internet at broadband speeds on mobile phones. The best-known 3G feature is video telephony, i.e. simultaneous voice and non-voice data transfer. From a development point of view, 3G can be a great booster in areas such as telemedicine, mobile banking and inclusive banking (taking banking and financial services to remote villages); it can help provide weather updates and market price information to farmers; and it can provide TV and entertainment services to remote villages. It can also be used in disaster management, as a cheap and effective means of coordinating rescue and relief efforts with the use of live images.
4G
4G technology is essentially an extension of 3G, with more bandwidth and more services than 3G offers. A 4G technology is one that satisfies the requirements of the IMT-Advanced (International Mobile Telecommunications-Advanced) standard, as defined by the International Telecommunication Union (ITU). According to the ITU, an IMT-Advanced cellular system must have data transfer speeds of up to approximately 100 Mbps for mobile users and up to approximately 1 Gbps for fixed users. A 4G system is expected to provide facilities such as IP telephony, ultra-broadband Internet access, gaming services, streamed multimedia and even HDTV (high-definition TV).
The main objective of this generation is convergence: all the technologies merge into one simplified structure, so that a user can receive mobile services on a fixed line and broadband services on a mobile. In short, any network can be accessed from any terminal. The main 4G technologies are LTE (Long Term Evolution), WiMAX (Worldwide Interoperability for Microwave Access), UMB (Ultra Mobile Broadband) and EV-DO (Rev. C). An advantage of this generation is that it can also interwork with 3G technologies and even with GSM Phase 2+.
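To put these data rates in perspective, here is a short sketch (in Python; the 5 MB file size is an arbitrary example, and the rates are the nominal figures quoted in this article rather than real-world throughput):

```python
# Nominal download times for one file across mobile generations.
RATES_KBPS = {
    "GPRS (2.5G)": 114,                    # upper end of 56-114 kbps
    "EDGE (2.75G)": 384,
    "4G mobile (IMT-Advanced)": 100_000,   # ~100 Mbps
    "4G fixed (IMT-Advanced)": 1_000_000,  # ~1 Gbps
}

FILE_MB = 5  # illustrative file size, e.g. a short video clip

for name, kbps in RATES_KBPS.items():
    seconds = FILE_MB * 8000 / kbps  # 1 MB = 8000 kilobits
    print(f"{name:26s} ~{seconds:9.1f} s for {FILE_MB} MB")
```

At GPRS rates the clip takes nearly six minutes to download; at nominal 4G mobile rates it takes well under a second.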
Aircel and Bharti Airtel are offering these services in India.
Direct-to-Home (DTH) broadcasting is the technology that allows people to receive television programmes in their homes directly via a small dish antenna, using the Ku satellite transmission band. In the traditional method, a television programme is sent out from a terrestrial low-power transmitter (LPT) and received with small, cheap half-wave dipole antennas. In satellite-based television services, a local cable operator receives the programmes (channels) using one or more 3-4 m diameter antennas and sends them to individual homes over coaxial cable; the C band (3.4-6.65 GHz) is used for transmission.
On 2 November 2000, India permitted the introduction of Direct-to-Home services in the country.
DTH technology ensures the highest quality of voice and video because it uses a digital communication system. The DTH service also provides various value-added services such as Internet access, telemedicine and video conferencing. Since all the encoded transmission signals in DTH are digital, it provides higher-resolution pictures and better audio than traditional analog signals. In recent years DTH has become the buzzword in the satellite broadcasting industry because of the immense opportunities it offers broadcasters and viewers.
Computer networks
A computer network, often simply referred to as a network, is a collection of computers and other hardware components interconnected by communication channels that allow the sharing of resources and information. When at least one process in one device can send data to, or receive data from, at least one process residing in a remote device, the two devices are said to be in a network. Put simply, more than one computer interconnected through a communication medium for information interchange is called a computer network.
Networks may be classified according to a wide variety of characteristics, such as the medium used to transport the data, communications protocol used, scale, topology, and organizational scope.
Communications protocols define the rules and data formats for exchanging information in a computer network, and provide the basis for network programming. Well-known communications protocols include Ethernet, a hardware and link-layer standard that is ubiquitous in local area networks, and the Internet protocol suite, which defines a set of protocols for internetworking, i.e. for data communication between multiple networks, as well as host-to-host data transfer and application-specific data transmission formats.
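As a minimal, self-contained illustration of two processes exchanging data over a network connection (a sketch in Python using the standard socket module; the loopback address, port number and message are arbitrary example values):

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # example loopback address and free port

# Server process: bind and listen before the client tries to connect.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def echo_once():
    conn, _addr = srv.accept()         # wait for one client connection
    with conn:
        conn.sendall(conn.recv(1024))  # echo the received bytes back
    srv.close()

threading.Thread(target=echo_once).start()

# Client process: connect, send a message, read the echo.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello, network")
    print(cli.recv(1024))              # prints b'hello, network'
```

Here both halves run on one machine for convenience; on a real network the server and client would sit on different devices, which is exactly the send/receive relationship described above.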
Networks bring benefits but also drawbacks:
- Facilitate communications
- Share network and computing resources
- May be insecure
- May interfere with other technologies
- May be difficult to set up
LAN is short for Local Area Network. It allows several computers within a building to be linked together. This interconnection of computer terminals located in close proximity, such as in an industrial complex, an office building or a university campus, enables each terminal to interact with any other. LAN is therefore a multi-user system. Not only does a LAN make the exchange of data among the linked computers possible, but resources from a large computer can also be shared. LAN thus offers the most effective means of handling local automated tasks and data management.
A LAN can even be linked to an outside computer network using a 'gateway'. The gateway converts data formats to make them compatible between the two networks.
Internet is an abbreviation of 'internetwork system' and is described as a network of networks. The Internet is the most important information network available in the world: a worldwide computer network containing a vast collection of information that can be made available on your computer. It is arguably the largest and most complete learning tool for people with varied educational backgrounds and interests.
SUPER COMPUTERS
A computer with a very high computing speed, several thousand times faster than that of conventional machines, is called a supercomputer.
The term 'supercomputing' was first used by the New York World newspaper in 1929 to refer to large custom-built tabulators that IBM had made for Columbia University. Supercomputers, introduced in the 1960s, were designed primarily by Seymour Cray at Control Data Corporation, which led the market into the 1970s.
A supercomputer will usually have more than one central processing unit (CPU), which allows the computer to do faster circuit switching and accomplish more tasks at once. (Because of this, a supercomputer will also have an enormous amount of storage, so that it can work on many tasks at a time.) It will also have the capability to do vector arithmetic, which means that it can operate on whole lists of values instead of just one at a time.
There are two major parts to a supercomputer: the central processing unit (CPU) and the memory. The role of the CPU is to carry out instructions, and the memory is where the instructions and data are stored. Engineers who design supercomputers use high-performance circuits and architectures to make CPUs that are 10 to 20 times faster than the top-of-the-line CPUs used in other commercial computers.
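A small sketch of what vector arithmetic means in practice (Python with NumPy standing in for vector hardware; the array size and formula are arbitrary examples):

```python
import numpy as np

# Two example vectors of one million elements each.
a = np.arange(1_000_000, dtype=np.float64)
b = np.arange(1_000_000, dtype=np.float64)

# Scalar style: one element per step, as in a conventional CPU loop.
c_scalar = np.empty_like(a)
for i in range(len(a)):
    c_scalar[i] = 2.0 * a[i] + b[i]

# Vector style: one operation over the whole list of values. Vector
# units apply the same arithmetic to many elements per instruction,
# which is why this form runs dramatically faster.
c_vector = 2.0 * a + b

assert np.allclose(c_scalar, c_vector)  # both compute the same result
```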
Supercomputers are used for many demanding tasks in today's society, such as weather forecasting, fluid-dynamics simulation and other large scientific computations.
Examples of supercomputers include Roadrunner, built by IBM, and Param Padma, developed by C-DAC of India.
Indian attempts have aimed at developing supercomputers not by pursuing frontier processor technology, but by speeding up computation using many slower processors working in parallel.
Scientists at the Advanced Numerical Research and Analysis Group (ANURAG) of the Defence Research and Development Organisation (DRDO) have developed the high-speed, user-friendly PACE (Processor for Aerodynamic Computation and Evaluations) system.
The DRDO's PACE project was primarily intended for fluid-dynamics studies, which require high computational speeds. Later, the PACE Plus supercomputer, also developed by the DRDO, was launched in 1995-96. After the PACE supercomputers, scientists at the Centre for Development of Advanced Computing developed a new model of the PARAM supercomputer, capable of 1,00,000 million calculations per second.
The Supercomputer Education and Research Centre (SERC) was established in 1990 to provide a state-of-the-art computing facility to the faculty and students of the Indian Institute of Science. Apart from functioning as the central computing facility of IISc, SERC is engaged in education and research programmes in areas relating to supercomputer development and application.
India made a minor improvement, from nine supercomputers in the previous year's TOP500 list to 11 in the 2015 list.
India's fastest supercomputer, Pratyush, is installed at Pune's Indian Institute of Tropical Meteorology.
Peak performance: 6.8 petaflops
Application: Will help weather analysis reach international standards, and attain improved predictions and warnings of natural disasters.
National Supercomputing Mission
India's position can be expected to improve, as the Cabinet has approved the National Supercomputing Mission with an outlay of Rs 4,500 crore over a period of seven years. The mission aims to set up a grid connecting 70 supercomputers located in research and development institutions and universities, and a cloud of about one million cores, using the National Knowledge Network. It will be implemented in association with the Department of Science and Technology (DST) and the Department of Information Technology (DIT).
The mission also aims to catalyse the government's Digital India vision by making huge data storage space available and linking systems. However, there are challenges before the plan can be undertaken at full scale. The first is building both the software and hardware infrastructure for such a large-scale project and developing applications for these supercomputers. More importantly, finding and training the manpower to run these supercomputers will also be a challenge.
PARAM 8000 is considered India's first supercomputer. It was indigenously built in 1990 by the Centre for Development of Advanced Computing (C-DAC).
Housed in the Terascale Supercomputing Facility at C-DAC's Knowledge Park in Bangalore, this next generation of the now well-known Param series has been nicknamed Param Padma. Padma in Indian mythology stands for 10 to the power of 12; one teraflop, or 1,000 gigaflops, stands for one trillion floating-point operations per second. The Param Padma stands at 500,000 megaflops and boasts 250 processors. A real source of pride is that two crucial technologies that control the teraflop machine have been indigenously developed.
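Writing out the unit arithmetic behind those figures (standard conversions applied to the numbers quoted above):

$$1\ \text{teraflop} = 10^{12}\ \text{flops} = 1000\ \text{gigaflops} = 10^{6}\ \text{megaflops}$$

$$500{,}000\ \text{megaflops} = 5 \times 10^{11}\ \text{flops} = 0.5\ \text{teraflop}$$

So the quoted 500,000 megaflops places Param Padma at half a teraflop.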
The PARAM Yuva II supercomputer has also been built by the Centre for Development of Advanced Computing (C-DAC).
Indian systems in the 2015 TOP500 list:

| Rank | Site | Name |
|------|------|------|
| 96 | Indian Institute of Science | SahasraT (SERC - Cray XC40) |
| 119 | Indian Institute of Tropical Meteorology | Aaditya (iDataPlex DX360M4) |
| 145 | Tata Institute of Fundamental Research | TIFR - Cray XC30 |
| 166 | Indian Institute of Technology Delhi | HP Apollo 6000 Xl230/250 |
| 251 | Centre for Development of Advanced Computing | PARAM Yuva - II |
| 286 | Indian Institute of Technology Kanpur | Cluster Platform SL230s Gen8 |
| 300 | CSIR Centre for Mathematical Modelling and Computer Simulation | Cluster Platform 3000 BL460c Gen8 |
| 313 | National Centre for Medium Range Weather Forecasting | iDataPlex DX360M4 |
| 316 | IT Services Provider | Cluster Platform SL250s Gen8 |
| 380 | Network Company | - |
| 397 | - | Cluster Platform SL210T |
The TOP500 celebrated its 25th anniversary in 2018 with a major shakeup at the top of the list. For the first time since November 2012, the US claimed the most powerful supercomputer in the world, leading a significant turnover in which four of the five top systems were either new or substantially upgraded.
Summit, an IBM-built supercomputer now running at the Department of Energy's (DOE) Oak Ridge National Laboratory (ORNL), captured the number one spot with a performance of 122.3 petaflops on High Performance Linpack (HPL), the benchmark used to rank the TOP500 list. Summit has 4,356 nodes, each one equipped with two 22-core Power9 CPUs and six NVIDIA Tesla V100 GPUs. The nodes are linked together with a Mellanox dual-rail EDR InfiniBand network.
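A quick back-of-the-envelope check on those figures (simple division over the numbers quoted above):

$$\frac{122.3\ \text{petaflops}}{4356\ \text{nodes}} \approx 28\ \text{teraflops per node}$$

so each node contributes roughly 28 teraflops to the aggregate HPL result.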
Sunway TaihuLight, a system developed by China’s National Research Center of Parallel Computer Engineering & Technology (NRCPC) and installed at the National Supercomputing Center in Wuxi, drops to number two after leading the list for the past two years. Its HPL mark of 93 petaflops has remained unchanged since it came online in June 2016.
Sierra, a new system at the DOE’s Lawrence Livermore National Laboratory took the number three spot, delivering 71.6 petaflops on HPL. Built by IBM, Sierra’s architecture is quite similar to that of Summit, with each of its 4,320 nodes powered by two Power9 CPUs plus four NVIDIA Tesla V100 GPUs and using the same Mellanox EDR InfiniBand as the system interconnect.
Tianhe-2A, also known as Milky Way-2A, moved down two notches into the number four spot, despite receiving a major upgrade that replaced its five-year-old Xeon Phi accelerators with custom-built Matrix-2000 coprocessors. The new hardware increased the system’s HPL performance from 33.9 petaflops to 61.4 petaflops, while bumping up its power consumption by less than four percent. Tianhe-2A was developed by China’s National University of Defense Technology (NUDT) and is installed at the National Supercomputer Center in Guangzhou, China.
The new AI Bridging Cloud Infrastructure (ABCI) is the fifth-ranked system on the list, with an HPL mark of 19.9 petaflops. The Fujitsu-built supercomputer is powered by 20-core Xeon Gold processors along with NVIDIA Tesla V100 GPUs. It’s installed in Japan at the National Institute of Advanced Industrial Science and Technology (AIST).
Piz Daint (19.6 petaflops), Titan (17.6 petaflops), Sequoia (17.2 petaflops), Trinity (14.1 petaflops), and Cori (14.0 petaflops) move down to the number six through 10 spots, respectively.
VIRTUAL REALITY
Virtual Reality (VR) is a system that enables one or more users to move and react in a computer-simulated environment. Various types of devices allow users to sense and manipulate virtual objects much as they would real objects. This natural style of interaction gives participants the feeling of being immersed in the simulated world. Virtual worlds are created by mathematical models and computer programs.
It is basically an advanced version of computer graphics designed on the basis of real life or imaginary situations.
Flight simulators that allow pilot trainees to learn flying without ever leaving the ground also create a virtually real environment of the cockpit of an actual aeroplane.
Thus all conditions of an actual flight are duplicated so that a trainee can gain experience without taking the risk of flying a real aeroplane.
Virtual reality can lead to new and exciting discoveries in these areas which impact upon our day to day lives. Wherever it is too dangerous, expensive or impractical to do something in reality, virtual reality is the answer. From trainee fighter pilots to medical applications trainee surgeons, virtual reality allows us to take virtual risks in order to gain real world experience. As the cost of virtual reality goes down and it becomes more mainstream you can expect more serious uses, such as education or productivity applications, to come to the fore. Virtual reality and its cousin augmented reality could substantively change the way we interface with our digital technologies, continuing the trend of humanizing our technology.
Major applications
Military uses of virtual reality: these include flight simulation, battlefield simulation, medic training and vehicle simulation. Virtual reality is designed to be used as an additional aid and will not replace real-life training.
Virtual reality and education: VR can present complex data to students in an accessible way that is both fun and easy to learn; for example, astronomy students can explore the solar system. This is useful for students with a particular learning style, e.g. creative learners or those who find it easier to learn using symbols, colours and textures.
Virtual reality in healthcare: VR allows healthcare professionals to learn new skills, as well as refresh existing ones, in a safe environment, without causing any danger to patients.
Virtual robotic surgery: surgery is performed by means of a robotic device controlled by a human surgeon, which reduces time and the risk of complications. Virtual reality has also been used for training purposes and in the field of remote telesurgery, in which the surgeon operates from a location separate from the patient. The main feature of this system is force feedback, as the surgeon needs to be able to gauge the amount of pressure to use when performing a delicate procedure.
Virtual reality in sport: VR is used as a training aid in many sports such as golf, athletics, skiing and cycling. It is used to measure athletic performance and analyse technique, and it is also used in clothing and equipment design and as part of the drive to improve the audience's experience.
Virtual reality and scientific visualization: this field is based on using computer graphics to express complex ideas and scientific concepts, for example molecular models or statistical results.
ARTIFICIAL INTELLIGENCE
Artificial intelligence (AI) is the branch of science concerned with creating computer programs that can perform actions comparable with those of an intelligent human. Current AI research covers areas such as planning (for robot behaviour), language understanding, pattern recognition and knowledge representation. The possibility of artificial intelligence was first proposed by the English mathematician Alan Turing in 1950.
It is now thought that intelligent behaviour depends as much on the knowledge a system possesses as on its reasoning power. Present emphasis is on knowledge-based systems, such as expert systems, while research projects focus on neural networks, which attempt to mimic the structure of the human brain.
On the Internet, small bits of software that automate common routines or attempt to predict human likes or behaviour based on past experience are called intelligent agents or bots.
The most important fields of research in this area are information processing, pattern recognition, game-playing computers, and applied fields such as medical diagnosis.
In medicine, programs have been developed that analyze the disease symptoms, medical history, and laboratory test results of a patient, and then suggest a diagnosis to the physician.
The diagnostic program is an example of so-called expert systems—programs designed to perform tasks in specialized areas as a human would.
Many scientists remain doubtful that true AI can ever be developed. The operation of the human mind is still little understood, and computer design may remain essentially incapable of analogously duplicating those unknown, complex processes.
Various routes are being used in the effort to reach the goal of true AI. One approach is to apply the concept of parallel processing—interlinked and concurrent computer operations. Another is to create networks of experimental computer chips, called silicon neurons, that mimic data-processing functions of brain cells. Using analog technology, the transistors in these chips emulate nerve-cell membranes in order to operate at the speed of neurons.
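As a minimal illustration of the 'network of neurons' idea discussed above (a single artificial neuron sketched in Python; the weights and threshold are toy values chosen by hand, not taken from any real system):

```python
# One artificial neuron: a weighted sum of inputs passed through a
# threshold, loosely mimicking whether a nerve cell fires or not.
def neuron(inputs, weights, bias):
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0  # 1 = "fires", 0 = stays silent

# Hand-picked weights make this neuron compute a logical AND:
# it fires only when both inputs are active.
weights, bias = [1.0, 1.0], -1.5
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights, bias))
# Output: 0 0 -> 0, 0 1 -> 0, 1 0 -> 0, 1 1 -> 1
```

Neural networks connect many such units in layers and learn the weights from data rather than setting them by hand.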
CLOUD COMPUTING
Cloud computing is a general term for anything that involves delivering hosted services over the Internet. These services are broadly divided into three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). The name cloud computing was inspired by the cloud symbol that's often used to represent the Internet in flowcharts and diagrams.
A cloud service has three distinct characteristics that differentiate it from traditional hosting. It is sold on demand, typically by the minute or the hour; it is elastic -- which means that a user can have as much or as little of a service as they want at any given time; and the service is fully managed by the provider (the consumer needs nothing but a personal computer and Internet access). Significant innovations in virtualization and distributed computing, as well as improved access to high-speed Internet and a weak economy, have accelerated interest in cloud computing.
A cloud can be private or public. A public cloud sells services to anyone on the Internet. (Currently, Amazon Web Services is the largest public cloud provider.) A private cloud is a proprietary network or a data center that supplies hosted services to a limited number of people. When a service provider uses public cloud resources to create their private cloud, the result is called a virtual private cloud. Private or public, the goal of cloud computing is to provide easy, scalable access to computing resources and IT services.
Infrastructure-as-a-Service provides the customer with virtual server instances and storage, as well as application program interfaces (APIs) that allow the customer to start, stop, access and configure their virtual servers and storage. This model allows a company to pay for only as much capacity as is needed, and bring more online as soon as required. Because this pay-for-what-you-use model resembles the way electricity, fuel and water are consumed, it's sometimes referred to as utility computing.
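A toy sketch of the pay-for-what-you-use model (Python; the instance sizes and hourly rates are invented example figures, not any provider's actual pricing):

```python
# Utility-style billing: the customer pays only for capacity used.
HOURLY_RATE = {"small": 0.02, "medium": 0.08, "large": 0.32}  # $/hour

def bill(usage):
    """usage: list of (instance_size, hours_run) tuples."""
    return sum(HOURLY_RATE[size] * hours for size, hours in usage)

# Two small servers ran all month (720 h); a large one was spun up
# for a 48-hour traffic spike and then shut down.
usage = [("small", 720), ("small", 720), ("large", 48)]
print(f"${bill(usage):.2f}")  # $44.16
```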
Platform-as-a-Service in the cloud is defined as a set of software development tools hosted on the provider's infrastructure. Developers create applications on the provider's platform over the Internet. PaaS providers may use APIs, website portals or gateway software installed on the customer's computer. Force.com (an outgrowth of Salesforce.com) and Google Apps are examples of PaaS. Developers should note that there are currently no standards for interoperability or data portability in the cloud, and some providers will not allow software created by their customers to be moved off the provider's platform.
In the software-as-a-service cloud model, the vendor supplies the hardware infrastructure, the software product and interacts with the user through a front-end portal. SaaS is a very broad market. Services can be anything from Web-based email to inventory control and database processing. Because the service provider hosts both the application and the data, the end user is free to use the service from anywhere.
GRID COMPUTING
Grid systems are designed for collaborative sharing of resources; grid computing can also be thought of as distributed, large-scale cluster computing. A grid uses the processing capabilities of many different computing units to process a single task. The task is broken into multiple sub-tasks, and each machine on the grid is assigned one. As the sub-tasks are completed, the results are sent back to the primary machine, which combines them into a single output.
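A small sketch of that split/compute/combine pattern (Python's multiprocessing pool standing in for the machines on a grid; the workload, summing squares, is an arbitrary example):

```python
from multiprocessing import Pool

def subtask(chunk):
    """Work assigned to one 'machine' on the grid: sum of squares."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    numbers = list(range(1_000_000))
    # Break the task into four sub-tasks, one per worker.
    chunks = [numbers[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        partials = pool.map(subtask, chunks)  # farm out the sub-tasks
    print(sum(partials))  # the primary machine combines the results
```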
COMPUTER VIRUSES
A computer virus is a self-replicating computer program, or a segment of code, that inserts copies of itself into other programs, thus infecting them. As a result, the original programs cannot run smoothly, totally disrupting the functioning of the computer.
When an infected program is run on a computer, the hidden virus is activated and attempts to inject itself into additional programs. This process is analogous to the biological process of virus spreading from cell to cell within an organism. Very much like the biological virus, a single computer virus can spread to any number of compatible computer systems if provided with a pathway for infection.
A virus infects a system through the sharing of infected diskettes, or through communication links such as local area networks and programs transmitted from distant systems.
Some of the common destructive manifestations of a computer virus include erasure of recorded data, rendering entire disk drives unreadable, interfering with communications and breaking the security cordon of the host computer. A virus may also create other problems, such as displaying unusual messages or plotting text backwards. Some well-known viruses are Pakistani Brain, Lehigh, Friday the 13th, Christmas and Bloody.
Scanning software looks for a virus in one of two ways. If it’s a known virus (one that has already been detected in the wild and has an antidote written for it) the software will look for the virus’s signature — a unique string of bytes that identifies the virus like a fingerprint — and will zap it from your system. Most scanning software will catch not only an initial virus but many of its variants as well, since the signature code usually remains intact.
In the case of new viruses for which no antidote has been created, scanning software employs heuristics that look for unusual virus like activity on your system. If the program sees any funny business, it quarantines the questionable program and broadcasts a warning to you about what the program may be trying to do (such as modify your Windows Registry). If you and the software think the program may be a virus, you can send the quarantined file to the antivirus vendor, where researchers examine it, determine its signature, name and catalog it, and release its antidote. It’s now a known virus.
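A toy illustration of the signature-matching step (Python; the signature bytes and virus name are invented for the example and do not correspond to any real virus):

```python
# Signature scanning: search a file's bytes for known fingerprints.
KNOWN_SIGNATURES = {
    # Made-up 12-byte pattern standing in for a real signature.
    "ExampleVirus.A": bytes.fromhex("deadbeef4f6c645f56697275"),
}

def scan(path):
    """Return the names of any known viruses found in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in data]

# Usage (hypothetical file name):
# print(scan("suspect_download.exe"))
```

Heuristic detection, by contrast, has no byte pattern to match, and must instead watch what a program tries to do at run time.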
If the virus never appears again — which often happens when the virus is too poorly written to spread — then vendors categorize the virus as dormant. But viruses are like earthquakes: The initial outbreak is usually followed by aftershocks. Variants (copycat viruses that emerge in droves after the initial outbreak) make up the bulk of known viruses.
Within a few hours of when the Love Letter virus first appeared in the United States, a variant — VeryFunnyJoke — had already appeared, followed by more than 30 others during the next two months. And not all variants stem from mysterious writers. More than a few companies have been infected by variants created by a curious employee who fiddled with a virus he or she received, created a new strain of it, and unleashed it onto the company’s system—sometimes accidentally, sometimes not.
Computer viruses are the “common cold” of modern technology. They can spread swiftly across open networks such as the Internet, causing billions of dollars worth of damage in a short amount of time.
IT industry in India
The IT and ITeS sector comprises services related to information technology, research and development services, engineering design, hardware and BPO.
IT: The application of computers and telecommunications equipment to store, transmit, retrieve and manipulate data, in the context of a business or an enterprise.
ITeS: Information technology enabled services (ITeS) are a form of outsourced service that has emerged from the involvement of IT in various fields such as banking, finance, telecom and insurance. Examples of ITeS include medical transcription, back-office accounting, insurance claim processing and credit card processing.
Thus, the Indian IT and ITeS industry is divided into four major segments: IT services, business process outsourcing, engineering and R&D services, and hardware.
Globalisation has had a profound impact in shaping the Indian IT industry with India capturing a sizeable chunk of the global market for technology sourcing and business services.
The information technology (IT) and information technology enabled services (ITeS) industry has been one of the key driving forces fuelling India's economic growth.
It employs almost 10 million Indians and hence has contributed greatly to social transformation in the country.
IT is seen as a change enabler and a source of business value for organizations as Indian firms, across all other sectors, largely depend on the IT & ITeS service providers to make their business processes efficient and streamlined.
The IT-business process outsourcing (BPO) sector, including both the domestic and export segments, is growing strongly, onshore as well as offshore. Companies continue to move up the value chain to offer higher-end research and analytics services to their clients.
Over the years, the growth drivers for this sector have been the verticals of manufacturing, telecommunication, insurance, banking, finance and, of late, the fledgling retail revolution.
The growth in the Indian IT industry is expected to be around 30 per cent and the overall sales are projected to touch US$ 17 billion in FY 15, according to Manufacturers' Association of Information Technology (MAIT).
The Indian IT infrastructure market - comprising server, storage and networking equipment - is expected to grow by four per cent in 2014 to touch US$ 1.9 billion, according to Gartner.
The IT services market in India is expected to grow at the rate of 8.4 per cent in 2014 to Rs 476,356 million (US$ 7.88 billion), according to International Data Corporation (IDC).
As the new scenario unfolds, it is becoming clear that the future growth of IT and ITeS will be fuelled by the verticals of climate change, mobile applications, healthcare, energy efficiency and sustainable energy. Traditional business strongholds will make way for new geographies, there will be new customers, and more and more SMEs will adopt IT applications and services. Innovation is needed in three areas connected to India's information technology industry: business models, ecosystems and knowledge. The Indian information technology industry also needs to coordinate with academia and with other Indian industries for better performance and improved productivity.
India is the most preferred location for engineering offshoring, according to a customer poll conducted by Booz and Co. Companies are now offshoring complete product responsibility. An increased focus on R&D by IT firms in India has resulted in a rising number of patents filed by them. India's IT sector is gradually moving from linear models (raising headcount to increase revenue) to non-linear ones. In line with this, IT companies in the country are focusing on new models such as platform-based BPM services and the creation of intellectual property.
Tier II and III cities are increasingly gaining traction among IT companies aiming to establish business in India. Cheap labour, affordable real estate, favourable government regulations, tax breaks and special economic zone (SEZ) schemes are facilitating their emergence as new IT destinations.
Indian insurance companies also plan to spend Rs 12,100 crore (US$ 2.01 billion) on IT products and services in 2014, a 12 per cent rise over 2013, according to Gartner. This forecast includes spending by insurers on internal IT (including personnel), software, hardware, external IT services and telecommunications.
India is a preferred destination for companies looking to offshore their IT and back-office functions. It also retains its low-cost advantage and is a financially attractive location when viewed in combination with the business environment it offers and the availability of skilled people.
Between April 2000 and March 2010, the computer software and hardware sector received cumulative foreign direct investment (FDI) of US$ 9,872.49 million, according to the Department of Industrial Policy and Promotion.
Government initiatives include:
- Constituting the Technical Advisory Group for Unique Projects (TAGUP) under the chairmanship of Nandan Nilekani. The group will develop IT infrastructure in five key areas, including the New Pension System (NPS) and the Goods and Services Tax (GST).
- Setting up the National Taskforce on Information Technology and Software Development, with the objective of framing a long-term national IT policy for the country.
- Enacting the Information Technology Act, which provides a legal framework to facilitate electronic commerce and electronic transactions.
- Setting up Software Technology Parks of India (STPI) in 1991 for the promotion of software exports from the country. There are currently 51 STPI centres where, apart from exemption from customs duty on capital goods, there are also exemptions from service tax and excise duty, and a rebate on payment of Central Sales Tax. The most important incentive is 100 per cent exemption of export profits from income tax, which has been extended until 31 March 2011.
- Setting up Information Technology Investment Regions (ITIRs). These regions would be endowed with excellent infrastructure and would reap the benefits of co-siting, networking and greater efficiency through the use of common infrastructure and support services.
Moreover, according to NASSCOM, government IT spend was US$ 3.2 billion in 2009 and is expected to reach US$ 5.4 billion by 2011. Further, according to NASSCOM, there is a US$ 9 billion business opportunity in e-governance in India.
The e-governance initiative of the Government of India is based entirely on ICT, which has the potential to deliver citizen-centric services. The application of e-governance in healthcare can monitor and improve the quality of healthcare services and make the system efficient, transparent and cost-effective, as it brings healthcare providers, policy makers, professionals and the public onto a common platform.
NET NEUTRALITY
Net neutrality implies equal access to all websites for all. Any priority given to an application or company on a payment basis is seen as violating the concept.
The Telecom Regulatory Authority of India has put out a paper on how it plans to change users' relationship with the Internet, at the bidding of telecom companies. At present, once a user has paid for, say, 2 GB of surfing, it does not matter how he or she uses it, whether to watch YouTube videos, communicate through Skype, or just read. That's net neutrality. In other words, the telcos are neutral, or impartial, about the way traffic flows on the Internet. All this could change. Users may have to pay for specific services. So, if one provider offers free access to Facebook, it could charge for, maybe, Twitter, or there could be higher rates for Skype. Users would also lose the freedom to go anywhere online. Some services would have speedier access, others slower; some would be free, others expensive. The Internet would be sliced rather than offered as a whole. The user would be the loser.
Telcos have a point that while they built the network infrastructure, the profits are going to apps; services like WhatsApp benefit at their cost. The counter-argument is stronger: if telcos invested crores in infrastructure, they have also earned a lot from it. But innovation rules, and always has; that is the fascinating part about the open platform that the Internet is, the freedom it provides to come up with anything new, the spirit of enterprise it accords. Anyone with an idea has a chance, not just anyone with the money to pay telcos for free access and to score in the absence of competition. It is net neutrality that allows the online seller and the buyer, the giver and the taker, to benefit.
Telcos built the infrastructure to sell access to it, not to control what the consumer does on it. Once net neutrality, which is a regulation without a legal framework, is breached, the Internet users' interest would be sold out. Imagine startups failing to put up an online show because the money to pay telecom companies didn't add up.
By: Gurvinder Kour