THE COMPUTING INDUSTRY AND ROLES WITHIN.
HISTORY OF THE COMPUTING INDUSTRY.
The history of the computing industry stretches from the early computing hardware of the 19th century to modern computing technology, marked by key innovations such as the transistor, the integrated circuit and the internet.
ANALYSIS OF MAJOR MILESTONES IN THE HISTORY OF COMPUTING.
1820 – 1940
Mechanical calculators to the first digital computers.
-Between the 1820s and the 1840s the mechanical engineer Charles Babbage designed the first mechanical computers, capable of performing arithmetical calculations and featuring key computer components such as an arithmetic logic unit, a memory unit and conditional branching. Alongside him, the mathematician Ada Lovelace wrote the first algorithm intended to be carried out by a computer, making her the first computer programmer.
BIRTH OF ELECTRONIC DIGITAL COMPUTING.
-In 1943 the engineer Tommy Flowers and other British engineers designed the Colossus computer during World War II to decipher encrypted transmissions from the German Lorenz cipher teleprinter.
In 1945 the ENIAC (ELECTRONIC NUMERICAL INTEGRATOR AND COMPUTER), the first programmable general-purpose electronic digital computer, was built in the United States as a World War II project.
TRANSISTORS & INTEGRATED CIRCUITS.
In 1947 John Bardeen, Walter Brattain and William Shockley at Bell Labs created the first transistor, a device that could amplify or switch electronic signals, which led to the miniaturization and mass production of electronics.
In 1956 the first operating system was introduced: GM-NAA I/O, a batch processing system that automated job handling.
In 1958 Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor invented the integrated circuit (IC), now regarded as the backbone of modern electronics, as it is used in mobile phones, cars, medical devices and computer systems.
PERSONAL COMPUTING AND NETWORKING REVOLUTION.
In 1969 ARPANET (ADVANCED RESEARCH PROJECTS AGENCY NETWORK) was launched, connecting remote computers so they could communicate with each other.
In 1971 the Intel 4004 became the first commercially available microprocessor, a single-chip CPU with roughly the same computing power as the room-sized ENIAC.
Also in 1971 the first edition of Unix was released, revolutionizing OS design with simplicity, portability and multitasking, and in 1975 the Altair 8800 became the first commercially successful microcomputer.
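As a concrete illustration of the multitasking model Unix introduced, here is a minimal sketch in Python on a Unix-like system (fork() is the classic Unix system call for creating a new process; this little script is only a hypothetical demo):

import os

# fork() is the classic Unix multitasking primitive: one call, and the
# program continues as two processes that the kernel schedules independently.
pid = os.fork()

if pid == 0:
    print("child process, pid", os.getpid())
else:
    os.wait()  # parent waits for the child to finish
    print("parent process, pid", os.getpid(), "child was", pid)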
In 1974 personal computers began to emerge, leading to simpler operating systems such as CP/M.
In the 1980s Apple and IBM released commercially successful personal computers.
In 1990 Tim Berners-Lee built the first website, and the World Wide Web was soon introduced to the public, transforming the network into a global information system.
QUANTUM COMPUTING.
In the early 1980s Richard Feynman and Paul Benioff laid the theoretical foundation of quantum computing, introducing the concept of quantum bits (qubits), which can exist in multiple states at the same time due to superposition and entanglement.
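To make superposition concrete, here is a minimal sketch (assuming Python with the numpy library; purely illustrative) of a qubit placed into an equal superposition of 0 and 1 by a Hadamard gate:

import numpy as np

# |0> basis state written as a vector of amplitudes
ket0 = np.array([1.0, 0.0])

# Hadamard gate: turns a definite state into an equal superposition
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

psi = H @ ket0                # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2      # Born rule: measurement probabilities

print("amplitudes:", psi)     # [0.7071 0.7071]
print("P(0), P(1):", probs)   # [0.5 0.5]

Measuring such a state gives 0 or 1 with equal probability, which is the sense in which a qubit "exists in multiple states at once" before measurement.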
In the 1980s and early 1990s network operating systems such as Novell NetWare and Windows NT were developed to manage network communication, giving users the ability to work in a collaborative environment with remote access and file sharing.
In 1994 Peter Shor developed Shor's algorithm, which highlighted the potential of quantum computers to break modern cryptographic systems.
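A rough idea of why this threatens cryptography: RSA security rests on the difficulty of factoring a large number N, and Shor's algorithm reduces factoring to finding the period of a^x mod N, the one step a quantum computer performs exponentially faster. The sketch below (plain Python, illustrative only, with the period found by brute force instead of by a quantum computer) shows the surrounding classical recipe for a toy number:

from math import gcd

def find_period(a, N):
    # Brute-force the order r of a mod N -- this is the step Shor's
    # algorithm hands to a quantum computer for an exponential speed-up.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_with_period(N, a):
    # Classical post-processing in Shor's algorithm: turn the period into factors.
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None                        # odd period: try a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                        # trivial square root: try a different a
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_with_period(15, 7))           # (3, 5) -- the factors of 15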
In the 2000s smartphones were invented, running mobile operating systems such as iOS and Android.
In the 2010s operating systems integrated AI features such as Siri, Google Assistant and Alexa, with voice commands, predictive text and personalized recommendations.
In 2019 Google achieved quantum supremacy, using a 53-qubit machine to solve in minutes a mathematical problem estimated to take a classical supercomputer thousands of years.
ARTIFICIAL INTELLIGENCE.
The development of AI has accelerated since the 2010s, and it is now part of everyday life, with features such as facial recognition used to access banking apps, unlock phones and authorize payments; AI chatbots used in customer service; and self-driving and semi-automated vehicles such as Tesla's, which use AI algorithms to make driving decisions.
Trends and an opinion on where the industry is heading.
Machine learning
Machine learning has made major advances throughout the years, becoming more accessible, sophisticated and ethical.
Key trends in machine learning:
1. Advancements in generative AI and multimodality, with growth in systems that create visual and written content.
2. Autonomous AI agents, giving AI systems the power to perform complex tasks with minimal human interaction.
3. Enhanced AI in cybersecurity, to detect cyber threats more easily, identify unusual patterns and respond strategically to attacks.
WHERE IS MACHINE LEARNING HEADING?
The future of machine learning is heading toward systems with real-world impact: advancements in robotics, autonomous vehicles and scientific discovery that will reduce human labour; integration and democratization that will allow non-experts to build and deploy machine learning solutions; hardware innovation that will advance AI chips; edge AI and federated learning that will process data locally and train models across decentralized devices; and multimodal AI that will allow seamless processing and integration of various data types.
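As one illustration of that democratization, here is a minimal sketch of how little code it now takes to train and evaluate a model (assuming the scikit-learn library is installed; the dataset and model choice are only examples):

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a small built-in dataset and split it into training and test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Train an off-the-shelf model and measure its accuracy on unseen data
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

Off-the-shelf libraries like this are a large part of why non-experts can now build and deploy machine learning solutions.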
Info-graphic of the basic structure of a web development company and a digital/social media platform.
Info-graphics on key career pathways and three job roles in each.
SEE THE TASK THREE MARKETING RESEARCH IN A SEPARATE POWERPOINT DOCUMENT NAMED PROJECT B DIGITAL INDUSTRY ROLES.