
Technology in the 21st Century (Essay Sample)


Modern technology is an area that every business needs to consider, because technological advancements are pivotal in enhancing business operations around the globe. Most businesses thrive on modern technology, which has advanced business operations in several ways: efficient marketing through social media platforms, effective mass communication with all personnel, and effectual means of storing and accessing the data a business needs in order to function. Social media in particular commands a huge following, and its effective use is an index of a business's ability to increase sales volume. This paper examines technology in the 21st century.

Technology has played a crucial role in the advance of globalization in the 21st century. Globalization has had huge impacts on the economic world, bringing an array of merits and demerits, and new technological trends have been fundamental to its rapid progress. People now connect and communicate with others in geographically distant areas: someone who has lived in one country or on one continent can obtain pertinent information about faraway places, communicating with their inhabitants and learning about business and other important matters across distant geographies. Rapid globalization has also enhanced economic development from a business perspective.

Modern technology stimulates most business activity around the world, because most businesses in the 21st century make extensive use of it to conduct their operations. Many people now transact business across great distances through technology, which has given the world the outlook of a global, interactive society where people can share and access new ideas and vital information. Exchanging business ideas and transacting from afar has helped disqualify distance as a barrier to commerce, and most businesses have taken advantage of globalization and modern technological trends to enhance their operations.

Information technology (IT) and the Internet are prominent technological trends in the 21st century, chiefly because most businesses have adopted IT in their operations for the sake of effectiveness. Information technology has had an array of impacts on businesses around the world; a good example is its effect on competition. Firms around the world use the Internet and information technology to outdo rivals that provide similar services. Organizations have also leaned on information technology as a chief criterion in interviews and evaluations, as people with IT competence are huge assets for most businesses.

In the 21st century, technology has evolved into an inevitable aspect of life around the globe. Scholars have shown that most people can carry out projects and make business plans that draw extensively on information technology competencies and services. A large share of the plans business people make, for example marketing and management plans, incorporate IT experts as advisors who give proprietors the best techniques to apply. Information technology is also a field that demands uniformity and accuracy in its undertakings: where one uses modern technology to market or manage business operations, uniformity, accurate targeting, and effective marketing strategies have been prominent outcomes of the good use of technological trends.


Essay on Modern Technology

Students are often asked to write an essay on Modern Technology in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Modern Technology

Introduction to Modern Technology

Modern technology refers to the recent advancements and innovations that have made our lives easier. It includes computers, smartphones, the internet, and many more.

Benefits of Modern Technology

Modern technology has numerous benefits. It helps us communicate with people worldwide, provides information at our fingertips, and makes learning fun and interactive.

Challenges of Modern Technology

Despite the benefits, modern technology also poses some challenges. It can lead to addiction and loss of privacy. It’s crucial to use technology wisely to avoid these issues.

In conclusion, modern technology has changed our lives significantly. It’s our responsibility to use it responsibly and reap its benefits.

250 Words Essay on Modern Technology

The Advent of Modern Technology

Modern technology, an offshoot of the ceaseless human quest for innovation, has become an integral part of our lives. It has not only revolutionized communication and information dissemination but also transformed the way we live, work, and play.

Impact on Communication and Information

The advent of the Internet and smartphones has democratized information, making it accessible to everyone, everywhere. Social media platforms have given a voice to the voiceless, enabling a global dialogue that transcends geographical boundaries. Additionally, the emergence of artificial intelligence and machine learning has opened up new frontiers in data analysis and decision-making processes.

Transforming Daily Life

Modern technology has also significantly altered our daily routines. Smart homes, equipped with automated devices, have enhanced comfort and convenience. Wearable technology monitors our health, encouraging proactive wellness. Furthermore, e-commerce platforms and digital payment systems have streamlined shopping and financial transactions.

Work and Play in the Digital Age

In the workspace, technology has automated repetitive tasks, freeing up time for creative and strategic thinking. Remote working, made possible by digital tools, has blurred the lines between office and home. Meanwhile, in the realm of entertainment, virtual and augmented reality technologies have redefined our concept of play, immersing us in interactive digital worlds.

The Double-edged Sword

However, this technological revolution is a double-edged sword. While it brings countless benefits, it also presents challenges such as privacy concerns, cybercrime, and digital addiction. It is, therefore, crucial to navigate this digital landscape with caution, leveraging its advantages while mitigating its potential risks.

500 Words Essay on Modern Technology

In the contemporary era, modern technology has emerged as a significant facet of human life. It has revolutionized the way we communicate, learn, work, and entertain ourselves. The rapid evolution of technology, from the advent of the internet to the development of artificial intelligence, has had profound implications on society, economy, and culture.

The Impact of Modern Technology on Communication

Modern technology has drastically transformed the realm of communication. The rise of social media platforms and instant messaging apps has made it possible to connect with people across the globe in real time. Emails and video conferences have replaced traditional letters and face-to-face meetings, making communication faster and more efficient. However, this digital revolution has also raised concerns about privacy and the authenticity of information disseminated online.

Modern Technology and Entertainment

In the realm of entertainment, modern technology has given rise to new forms of media and has changed the way we consume content. Streaming platforms have challenged traditional television, and online gaming has become a global phenomenon. While these advancements have democratized entertainment, they have also raised questions about digital addiction and mental health.


21 Most Important Inventions of the 21st Century


The human race has always innovated, and in a relatively short time went from building fires and making stone-tipped arrows to creating smartphone apps and autonomous robots. Today, technological progress will undoubtedly continue to change the way we work, live, and survive in the coming decades.

Since the beginning of the new millennium, the world has witnessed the emergence of social media, smartphones, self-driving cars, and autonomous flying vehicles. There have also been huge leaps in energy storage, artificial intelligence, and medical science. Men and women have mapped the human genome and are grappling with the ramifications of biotechnology and gene editing. 

We are facing immense challenges in global warming and food security, among many other issues. While human innovation has contributed to many of the problems we are facing, it is also human innovation and ingenuity that can help humanity deal with these issues.

24/7 Wall St. examined media reports and other sources on the latest far-reaching innovations to find some of the most important 21st-century inventions. In some cases, though there was precursor research and there were ancillary technologies before 2001, the innovation did not become available to the public until this century. This list focuses on innovations (such as touch screen glass) that support products rather than the specific products themselves (like the iPhone).

It remains to be seen if all the technology on this list will continue to have an impact throughout the century. Legislation in the United States may limit the longevity of e-cigarettes, for example. But some of the inventions of the last 20 years will likely have staying power for the foreseeable future.


1. 3D printing

Most inventions come as a result of previous ideas and concepts, and 3D printing is no different. The earliest application of the layering method used by today’s 3D printers took place in the manufacture of topographical maps in the late 19th century, and 3D printing as we know it began in 1980.

The convergence of cheaper manufacturing methods and open-source software, however, has led to a revolution in 3D printing in recent years. Today, the technology is being used in the production of everything from lower-cost car parts to bridges to less painful ballet slippers, and it is even being considered for artificial organs.


2. E-cigarettes

While components of the technology have existed for decades, the first modern e-cigarette was introduced in 2006. Since then, the devices have become wildly popular as an alternative to traditional cigarettes, and new trends, such as the use of flavored juice, have contributed to the success of companies like Juul.

Recent studies have shown that there remains a great deal of uncertainty and risk surrounding the devices, with an increasing number of deaths and injuries linked to vaping. In early 2020, the FDA issued a widespread ban on many flavors of cartridge-based e-cigarettes, in part because those flavors are especially popular with children and younger adults.


3. Augmented reality

Augmented reality, in which digital graphics are overlaid onto live footage to convey information in real time, has been around for a while. Only recently, however, following the arrival of more powerful computing hardware and the creation of an open-source video tracking software library known as ARToolKit, has the technology really taken off.

Smartphone apps like the Pokémon Go game and Snapchat filters are just two small popular examples of modern augmented reality applications. The technology is being adopted as a tool in manufacturing, health care, travel, fashion, and education.


4. Birth control patch

The early years of the millennium brought about an innovation in family planning, albeit one that is still focused only on women and does nothing to protect against sexually transmitted infections. Still, the birth control patch, first released in the United States in 2002, has made it much easier for women to prevent unintended pregnancies. The plastic patch contains the same estrogen and progesterone hormones found in birth control pills and delivers them through the skin, much as nicotine patches deliver nicotine to help people quit tobacco products.


5. Blockchain

You’ve likely heard about it even if you don’t fully understand it. The simplest explanation of blockchain is that it is an incorruptible way to record transactions between parties — a shared digital ledger that parties can only add to and that is transparent to all members of a peer-to-peer network where the blockchain is logged and stored.

The technology was first deployed in 2008 to create Bitcoin, the first decentralized cryptocurrency, but it has since been adopted by the financial sector and other industries for myriad uses, including money transfers, supply chain monitoring, and food safety.
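
To make the append-only idea concrete, here is a minimal sketch in Python. It is purely illustrative (it is not how Bitcoin works; there is no consensus, mining, or network): each block records the hash of its predecessor, so editing any earlier entry invalidates every later link.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """A toy append-only chain of records."""
    def __init__(self):
        self.chain = [{"index": 0, "timestamp": time.time(),
                       "data": "genesis", "prev_hash": "0" * 64}]

    def add(self, data) -> None:
        prev = self.chain[-1]
        self.chain.append({"index": prev["index"] + 1,
                           "timestamp": time.time(),
                           "data": data,
                           "prev_hash": block_hash(prev)})

    def is_valid(self) -> bool:
        """Recompute every link; an edited earlier block breaks the chain."""
        return all(self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add({"from": "alice", "to": "bob", "amount": 5})
ledger.add({"from": "bob", "to": "carol", "amount": 2})
print(ledger.is_valid())                 # True
ledger.chain[1]["data"]["amount"] = 500  # tamper with a recorded transaction
print(ledger.is_valid())                 # False: the next link no longer matches
```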


6. Capsule endoscopy

Advancements in light-emitting diodes, image sensors, and optical design in the ’90s led to the emergence of capsule endoscopy, first used in patients in 2001. The technology uses a tiny wireless camera the size of a vitamin pill that the patient swallows. As the capsule traverses the digestive system, doctors can examine the gastrointestinal tract in a far less intrusive manner. Capsule endoscopy can be used to identify the source of internal bleeding, inflammation of the bowel, ulcers, and cancerous tumors.


7. Modern artificial pancreas

More formally known as a closed-loop insulin delivery system, the artificial pancreas has been around since the late ’70s, but the first versions were the size of a filing cabinet. In recent years the artificial pancreas, used primarily to treat type 1 diabetes, became portable. The first modern, portable artificial pancreas was approved for use in the United States in 2016.

The system continuously monitors blood glucose levels, calculates the amount of insulin required, and automatically delivers it through a small pump. British studies have shown that patients using these devices spent more time in their ideal glucose-level range. In December 2019, the FDA approved an even more advanced version of the artificial pancreas, called Control-IQ, developed by UVA.
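
At its core the closed loop is a feedback controller: read the sensor, compare the reading with a target, and command the pump. The Python sketch below is a deliberately simplified, hypothetical illustration of that loop; the setpoint, gain, and cap are made-up numbers, and real devices use clinically validated control algorithms with extensive safety logic.

```python
TARGET_MG_DL = 110   # hypothetical glucose setpoint (mg/dL)
GAIN = 0.02          # hypothetical proportional gain (units per mg/dL)
MAX_DOSE = 2.0       # hypothetical per-cycle safety cap (units)

def insulin_dose(glucose_mg_dl: float) -> float:
    """Command a dose proportional to how far glucose is above target."""
    error = glucose_mg_dl - TARGET_MG_DL
    return max(0.0, min(GAIN * error, MAX_DOSE))

# One monitor-calculate-deliver cycle per sensor reading
for reading in (95, 140, 210):
    print(f"{reading} mg/dL -> {insulin_dose(reading):.2f} units")
```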


8. E-readers

Sony was the first company to release an e-reader using a so-called microencapsulated electrophoretic display, commonly referred to as e-ink. E-ink technology, which mimics ink on paper that is easy on the eyes and consumes less power, had been around since the ’70s (and improved in the ’90s), but the innovation of e-readers had to wait until after the broader demand for e-books emerged. Sony was quickly overtaken by Amazon’s Kindle after its 2007 debut. The popularity of e-readers has declined with the emergence of tablets and smartphones, but they still command loyalty from bookworms worldwide.


9. Gene editing

Researchers from the University of California, Berkeley and a separate team from Harvard and the Broad Institute independently discovered in 2012 that a bacterial immune system known as CRISPR (an acronym for clustered regularly interspaced short palindromic repeats) could be used as a powerful gene-editing tool to make detailed changes to any organism’s DNA. This discovery heralded a new era in biotechnology.

The discovery has the potential to eradicate diseases, for example by altering the genes in mice and mosquitoes to combat the spread of Lyme disease and malaria, but it also raises ethical questions, especially with regard to human gene editing for reproductive purposes.


10. High-density battery packs

Tesla electric cars have received so much attention largely because of their batteries. The batteries, located underneath the passenger cabin, consist of thousands of high-density lithium ion cells, each barely larger than a standard AA battery, nestled into a large, heavy battery pack that also offers Tesla electric cars a road-gripping low center of gravity and structural support.

The brainchild of Tesla co-founder J.B. Straubel, these battery modules pack more of a punch than standard (and cheaper) electric car batteries. These packs are also being used in residential, commercial, and grid-scale energy storage devices.


11. Digital assistants

One of the biggest technology trends in recent years has been smart home technology, which can now be found in everyday consumer devices like door locks, light bulbs, and kitchen appliances. The key piece of technology that has helped make all this possible is the digital assistant. Apple was the first major tech company to introduce a virtual assistant called Siri, in 2011, for iOS.

Other digital assistants, such as Microsoft’s Cortana and Amazon’s Alexa, have since entered the market. The assistants gained another level of popularity when tech companies introduced smart speakers. Notably, Google Home and Amazon’s Echo can now be found in millions of homes, with an ever-growing range of applications.


12. Robot heart

Artificial hearts have been around for some time. They are mechanical devices connected to the actual heart or implanted in the chest to assist or replace a failing heart. Abiomed, a Danvers, Massachusetts-based company, developed a robot heart called AbioCor, a self-contained apparatus made of plastic and titanium whose only external component is a wireless battery pack attached to the wrist. Robert Tools, a technical librarian with congestive heart failure, received the first one on July 2, 2001.


13. Retinal implant

When he was a medical student, Dr. Mark Humayun watched his grandmother gradually lose her vision. The ophthalmologist and bioengineer focused on finding a solution to what causes blindness. He collaborated with Dr. James Weiland, a colleague at the USC Gayle and Edward Roski Eye Institute, and other experts to create the Argus II.

The Argus II is a retinal prosthesis device that is considered to be a breakthrough for those suffering from retinitis pigmentosa, an inherited retinal degenerative condition that can lead to blindness. The condition afflicts 1.5 million people worldwide. The device was approved by the U.S. Food and Drug Administration in 2013.


14. Mobile operating systems

Mobile operating systems have enabled the proliferation of smartphones and other portable gadgets thanks to their intuitive user interfaces and seemingly endless app options, and they have become the most consumer-facing of computer operating systems. When Google purchased Android Inc. in 2005, the operating system was just two years old, and the first iPhone (with its iOS) was still two years from its commercial debut.


15. Multi-use rockets

Billionaire entrepreneur Elon Musk may ultimately be remembered less for his contributions to electric car innovation than for his contributions to space exploration. Musk’s private space exploration company, SpaceX, has developed rockets that can be recovered and reused in other launches — a more efficient and cheaper alternative to using rockets only once and letting them fall into the ocean.

On March 30, 2017, SpaceX became the first to deploy one of these used rockets, the Falcon 9. Blue Origin, a space-transport company founded by Amazon.com’s Jeff Bezos, has launched its own reusable rocket.


16. Online streaming

Online streaming would not be possible without the convergence of widespread broadband internet access and cloud computing data centers used to store content and direct web traffic. While internet-based live streaming has been around almost since the internet was broadly adopted in the ’90s, it was not until the mid-2000s that the internet could handle the delivery of streaming media to large audiences. Online streaming is posing an existential threat to existing models of delivering media entertainment, such as cable television and movie theaters.


17. Robotic exoskeletons

Ever since researchers at the University of California, Berkeley created a robotic device in 2003 that attaches to the lower back to augment strength in humans, the demand for robotic exoskeletons for physical rehabilitation has increased, and manufacturing has taken off. Wearable exoskeletons are increasingly helping people with mobility issues (particularly lower body paralysis), and they are being used in factories. Ford Motor Company, for example, has used an exoskeleton vest that helps auto assemblers with repetitive tasks and lessens the wear and tear on shoulders and arms.


18. Small satellites

As modern electronic devices have gotten smaller, so, too, have orbital satellites, which companies, governments, and organizations use to gather scientific data, collect images of Earth, and serve telecommunications and intelligence purposes. These tiny, low-cost orbital devices fall into different categories by weight, but one of the most common is the shoebox-sized CubeSat. As of October 2019, more than 2,400 satellites weighing between 1 kg (2.2 lbs) and 40 kg (88 lbs) had been launched, according to the Nanosats Database.


19. Solid-state lidar

Lidar is an acronym that stands for light detection and ranging, and is also a portmanteau of the words “light” and “radar.” The technology today is most often used in self-driving cars. Like radars, which use radio waves to bounce off objects and determine their distance, lidar uses a laser pulse to do the same.

By sending enough laser pulses in rotation, it can create a constantly updated, high-resolution map of the surrounding environment. The next steps for the technology include smaller and cheaper lidar sensors, and especially solid-state ones — no spinning tops on the cars.
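
The range measurement itself is time-of-flight arithmetic: the pulse travels at the speed of light and covers the distance twice, out and back. A minimal sketch in Python:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to the target; the pulse covers the path out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# An echo arriving 200 nanoseconds after the pulse left
print(f"{range_from_echo(200e-9):.2f} m")  # ~29.98 m
```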


20. Tokenization

If you have ever used the chip embedded in a credit or debit card to make a payment by tapping rather than swiping, then you have benefited from the heightened security of tokenization. This data security technology replaces sensitive data with an equivalent randomized number – known as a token – that is used only once per transaction and has no value to would-be hackers and identity thieves attempting to intercept transaction data as it travels from sender to recipient. Social media site classmates.com was reportedly the first to use tokenization in 2001 to protect its subscribers’ sensitive data. Tokenization is also being touted as a way to prevent hackers from interfering with driverless cars.
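
A toy sketch of the idea in Python (an in-memory vault for illustration only; production tokenization adds encryption at rest, strict access control, and network isolation): the merchant handles only the random token, while the mapping back to the card number stays inside the vault.

```python
import secrets

class TokenVault:
    """Toy vault mapping single-use tokens to sensitive values."""
    def __init__(self):
        self._store = {}

    def tokenize(self, card_number: str) -> str:
        token = secrets.token_hex(16)   # random; carries no card data
        self._store[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # pop() enforces single use: a replayed token raises KeyError
        return self._store.pop(token)

vault = TokenVault()
tok = vault.tokenize("4111 1111 1111 1111")
print(tok)                     # safe to transmit; worthless if intercepted
print(vault.detokenize(tok))   # only the vault can reverse the mapping
```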


21. Touchscreen glass

Super-thin, chemically strengthened glass is a key component of the touchscreen world. This sturdy, transparent material is what helps keep your iPad or Samsung smartphone from shattering into pieces at the slightest drop. Even if these screens crack, in most cases the damage is cosmetic and the gadget still works. Corning Inc., already a leader in the production of treated glass used in automobiles, was asked by Apple to develop 1.3-mm treated glass for its iPhone, which debuted in 2007. Corning’s Gorilla Glass is still the most well known, though other brands exist in the marketplace.


The Evolution of Technology in K–12 Classrooms: 1659 to Today


Alexander Huls is a Toronto-based writer whose work has appeared in  The New York Times ,  Popular Mechanics ,  Esquire ,  The Atlantic  and elsewhere.

In the 21st century, it can feel like advanced technology is changing the K–12 classroom in ways we’ve never seen before. But the truth is, technology and education have a long history of evolving together to dramatically change how students learn.

With more innovations surely headed our way, why not look back at how we got to where we are today, while looking forward to how educators can continue to integrate new technologies into their learning?


Using Technology in the K–12 Classroom: A History

1659: Magic Lantern

  • Inventor:  Christiaan Huygens
  • A Brief History:  An ancestor of the slide projector, the magic lantern projected glass slides with light from oil lamps or candles. In the 1680s, the technology was brought to the education space to show detailed anatomical illustrations, which were difficult to sketch on a chalkboard.
  • Interesting Fact:  Huygens initially regretted his creation, thinking it was too frivolous.

1795: Pencil

  • Inventor:  Nicolas-Jacques Conté
  • A Brief History : Versions of the pencil can be traced back hundreds of years, but what’s considered the modern pencil is credited to Conté, a scientist in Napoleon Bonaparte’s army. It made its impact on the classroom, however, when it began to be mass produced in the 1900s.
  • Interesting Fact:  The Aztecs used a form of graphite pencil in the 13th century.

1801: Chalkboard

  • Inventor:  James Pillans
  • A Brief History:  Pillans — a headmaster at a high school in Edinburgh, Scotland — created the first front-of-class chalkboard, or “blackboard,” to better teach his students geography with large maps. Prior to his creation, educators worked with students on smaller, individual pieces of wood or slate. In the 1960s, the creation was upgraded to a green board, which became a familiar fixture in every classroom.
  • Interesting Fact:  Before chalkboards were commercially manufactured, some were made do-it-yourself-style with ingredients like pine board, egg whites and charred potatoes.

1888: Ballpoint Pen

  • Inventor:  John L. Loud
  • A Brief History:  John L. Loud invented and patented the first ballpoint pen after seeking to create a tool that could write on leather. It was not a commercial success. Fifty years later, following the lapse of Loud’s patent, Hungarian journalist László Bíró invented a pen with a quick-drying special ink that wouldn’t smear thanks to a rolling ball in its nib.
  • Interesting Fact:  When ballpoint pens debuted in the U.S., they were so popular that Gimbels, the department store selling them, made $81 million in today’s money within six months.


1950s: Overhead Projector

  • Inventor:  Roger Appeldorn
  • A Brief History:  Overhead projectors were used during World War II for mission briefings. However, 3M employee Appeldorn is credited with creating not only a projectable transparent film, but also the overhead projectors that would find a home in classrooms for decades.
  • Interesting Fact:  Appeldorn’s creation is the predecessor to today’s  bright and efficient laser projectors .

1959: Photocopier

  • Inventor:  Chester Carlson
  • A Brief History:  Because of his arthritis, patent attorney and inventor Carlson wanted to create a less painful alternative to making carbon copies. Between 1938 and 1947, working with The Haloid Photographic Company, Carlson perfected the process of electrophotography, which led to development of the first photocopy machines.
  • Interesting Fact:  Haloid and Carlson named their photocopying process xerography, which means “dry writing” in Greek. Eventually, Haloid renamed its company (and its flagship product line) Xerox .

1967: Handheld Calculator

  • Inventor:   Texas Instruments
  • A Brief History:  As recounted in our  history of the calculator , Texas Instruments made calculators portable with a device that weighed 45 ounces and featured a small keyboard with 18 keys and a visual display of 12 decimal digits.
  • Interesting Fact:  The original 1967 prototype of the device can be found in the Smithsonian Institution’s  National Museum of American History .

1981: The Osborne 1 Laptop

  • Inventor:  Adam Osborne, Lee Felsenstein
  • A Brief History:  Osborne, a computer book author, teamed up with computer engineer Felsenstein to create a portable computer that would appeal to general consumers. In the process, they provided the technological foundation that made modern one-to-one devices — like Chromebooks — a classroom staple.
  • Interesting Fact:  At 24.5 pounds, the Osborne 1 was about as big and heavy as a sewing machine, earning it the current classification of a “luggable” computer, rather than a laptop.

1990: World Wide Web

  • Inventor:  Tim Berners-Lee
  • A Brief History:  In the late 1980s, British scientist Berners-Lee created the World Wide Web to enable information sharing between scientists and academics. It wasn’t long before the Web could connect anyone, anywhere to a wealth of information, and it was soon on its way to powering the modern classroom.
  • Interesting Fact:  The first web server Berners-Lee created was so new, he had to put a sign on the computer that read, “This machine is a server. DO NOT POWER IT DOWN!”


What Technology Is Used in Today’s K–12 Classrooms?

Technology has come so far that modern classrooms are more technologically advanced than many science labs were two decades ago. Students have access to digital textbooks,  personal devices , collaborative  cloud-based tools , and  interactive whiteboards . Emerging technologies now being introduced to K–12 classrooms include voice assistants, virtual reality devices and 3D printers.

Perhaps the most important thing about ed tech in K–12 isn’t what the technology is, but how it’s used.

How to Integrate Technology into K–12 Classrooms

The first step to integrating technology into the K–12 classroom is  figuring out which solution to integrate , given the large variety of tools available to educators. That variety comes with benefits — like the ability to align tech with district objectives and grade level — but also brings challenges.

“It’s difficult to know how to choose the appropriate digital tool or resource,” says Judi Harris, professor and Pavey Family Chair in Educational Technology at the William & Mary School of Education. “Teachers need some familiarity with the tools so that they understand the potential advantages and disadvantages.”


K–12 IT leaders should also be careful not to focus too much on technology implementation at the expense of curriculum-based learning needs. “What districts need to ask themselves is not only whether they’re going to adopt a technology, but how they’re going to adopt it,” says Royce Kimmons, associate professor of instructional psychology and technology at Brigham Young University.

In other words, while emerging technologies may be exciting, acquiring them without proper consideration of their role in improving classroom learning will likely result in mixed student outcomes. For effective integration, educators should ask themselves, in what ways would the tech increase or support a student’s productivity and learning outcomes? How will it improve engagement?

Integrating ed tech also requires some practical know-how. “Teachers need to be comfortable and confident with the tools they ask students to use,” says Harris.

Professional development for new technologies is crucial, as are supportive IT teams, tech providers with generous onboarding programs and technology integration specialists. Harris also points to initiatives like YES: Youth and Educators Succeeding, a nonprofit organization that prepares students to act as resident experts and classroom IT support.

KEEP READING:  What is the continued importance of professional development in K–12 education?

But as educational technology is rolled out and integrated, it’s important to keep academic goals in sight. “We should never stop focusing on how to best understand and help the learner to achieve those learning objectives,” says Harris.

That should continue to be the case as the technology timeline unfolds, something Harris has witnessed firsthand during her four decades in the field. “It’s been an incredible thing to watch and to participate in,” she notes. “The great majority of teachers are extremely eager to learn and to do anything that will help their students learn better.”


How artificial intelligence is transforming the world

By Darrell M. West (Senior Fellow, Center for Technology Innovation; Douglas Dillon Chair in Governmental Studies) and John R. Allen

April 24, 2018

Artificial intelligence (AI) is a wide-ranging tool that enables people to rethink how we integrate information, analyze data, and use the resulting insights to improve decision making—and already it is transforming every walk of life. In this report, Darrell West and John Allen discuss AI’s application across a variety of sectors, address issues in its development, and offer recommendations for getting the most out of AI while still protecting important human values.

Table of Contents

I. Qualities of artificial intelligence
II. Applications in diverse sectors
III. Policy, regulatory, and ethical issues
IV. Recommendations
V. Conclusion


Most people are not very familiar with the concept of artificial intelligence (AI). As an illustration, when 1,500 senior business leaders in the United States in 2017 were asked about AI, only 17 percent said they were familiar with it. 1 A number of them were not sure what it was or how it would affect their particular companies. They understood there was considerable potential for altering business processes, but were not clear how AI could be deployed within their own organizations.

Despite this widespread lack of familiarity, AI is a technology that is transforming every walk of life. It is a wide-ranging tool that enables people to rethink how we integrate information, analyze data, and use the resulting insights to improve decisionmaking. Our hope through this comprehensive overview is to explain AI to an audience of policymakers, opinion leaders, and interested observers, and demonstrate how AI already is altering the world and raising important questions for society, the economy, and governance.

In this paper, we discuss novel applications in finance, national security, health care, criminal justice, transportation, and smart cities, and address issues such as data access problems, algorithmic bias, AI ethics and transparency, and legal liability for AI decisions. We contrast the regulatory approaches of the U.S. and European Union, and close by making a number of recommendations for getting the most out of AI while still protecting important human values. 2

In order to maximize AI benefits, we recommend nine steps for going forward:

  • Encourage greater data access for researchers without compromising users’ personal privacy,
  • invest more government funding in unclassified AI research,
  • promote new models of digital education and AI workforce development so employees have the skills needed in the 21st-century economy,
  • create a federal AI advisory committee to make policy recommendations,
  • engage with state and local officials so they enact effective policies,
  • regulate broad AI principles rather than specific algorithms,
  • take bias complaints seriously so AI does not replicate historic injustice, unfairness, or discrimination in data or algorithms,
  • maintain mechanisms for human oversight and control, and
  • penalize malicious AI behavior and promote cybersecurity.

Qualities of artificial intelligence

Although there is no uniformly agreed upon definition, AI generally is thought to refer to “machines that respond to stimulation consistent with traditional responses from humans, given the human capacity for contemplation, judgment and intention.” 3  According to researchers Shubhendu and Vijay, these software systems “make decisions which normally require [a] human level of expertise” and help people anticipate problems or deal with issues as they come up. 4 As such, they operate in an intentional, intelligent, and adaptive manner.

Intentionality

Artificial intelligence algorithms are designed to make decisions, often using real-time data. They are unlike passive machines that are capable only of mechanical or predetermined responses. Using sensors, digital data, or remote inputs, they combine information from a variety of different sources, analyze the material instantly, and act on the insights derived from those data. With massive improvements in storage systems, processing speeds, and analytic techniques, they are capable of tremendous sophistication in analysis and decisionmaking.

Artificial intelligence is already altering the world and raising important questions for society, the economy, and governance.

Intelligence

AI generally is undertaken in conjunction with machine learning and data analytics. 5 Machine learning takes data and looks for underlying trends. If it spots something that is relevant for a practical problem, software designers can take that knowledge and use it to analyze specific issues. All that is required are data that are sufficiently robust that algorithms can discern useful patterns. Data can come in the form of digital information, satellite imagery, visual information, text, or unstructured data.
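
As a deliberately tiny illustration of "looking for underlying trends," the Python sketch below fits a curve to a handful of made-up observations with NumPy and uses it to anticipate the next point. Real systems apply far richer models to far larger data, but the pattern is the same: fit on past data, then predict.

```python
import numpy as np

# Made-up past observations: hour of day vs. traffic volume
hours = np.array([6.0, 7.0, 8.0, 9.0, 10.0])
volume = np.array([120.0, 250.0, 400.0, 380.0, 300.0])

# Learn a quadratic trend: volume ~ a*h^2 + b*h + c
a, b, c = np.polyfit(hours, volume, deg=2)

# Apply the learned trend to a new input
h = 11.0
print(f"predicted volume at {h:.0f}:00 -> {a * h**2 + b * h + c:.0f}")
```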

Adaptability

AI systems have the ability to learn and adapt as they make decisions. In the transportation area, for example, semi-autonomous vehicles have tools that let drivers and vehicles know about upcoming congestion, potholes, highway construction, or other possible traffic impediments. Vehicles can take advantage of the experience of other vehicles on the road, without human involvement, and the entire corpus of their achieved “experience” is immediately and fully transferable to other similarly configured vehicles. Their advanced algorithms, sensors, and cameras incorporate experience in current operations, and use dashboards and visual displays to present information in real time so human drivers are able to make sense of ongoing traffic and vehicular conditions. And in the case of fully autonomous vehicles, advanced systems can completely control the car or truck, and make all the navigational decisions.


Applications in diverse sectors

AI is not a futuristic vision, but rather something that is here today and being integrated with and deployed into a variety of sectors. This includes fields such as finance, national security, health care, criminal justice, transportation, and smart cities. There are numerous examples where AI already is making an impact on the world and augmenting human capabilities in significant ways. 6

One of the reasons for the growing role of AI is the tremendous opportunities for economic development that it presents. A project undertaken by PricewaterhouseCoopers estimated that “artificial intelligence technologies could increase global GDP by $15.7 trillion, a full 14%, by 2030.” 7 That includes advances of $7 trillion in China, $3.7 trillion in North America, $1.8 trillion in Northern Europe, $1.2 trillion for Africa and Oceania, $0.9 trillion in the rest of Asia outside of China, $0.7 trillion in Southern Europe, and $0.5 trillion in Latin America. China is making rapid strides because it has set a national goal of investing $150 billion in AI and becoming the global leader in this area by 2030.

Meanwhile, a McKinsey Global Institute study of China found that “AI-led automation can give the Chinese economy a productivity injection that would add 0.8 to 1.4 percentage points to GDP growth annually, depending on the speed of adoption.” 8 Although its authors found that China currently lags the United States and the United Kingdom in AI deployment, the sheer size of its AI market gives that country tremendous opportunities for pilot testing and future development.

Finance

Investments in financial AI in the United States tripled between 2013 and 2014 to a total of $12.2 billion. 9 According to observers in that sector, “Decisions about loans are now being made by software that can take into account a variety of finely parsed data about a borrower, rather than just a credit score and a background check.” 10 In addition, there are so-called robo-advisers that “create personalized investment portfolios, obviating the need for stockbrokers and financial advisers.” 11 These advances are designed to take the emotion out of investing and undertake decisions based on analytical considerations, and make these choices in a matter of minutes.

A prominent example of this is taking place in stock exchanges, where high-frequency trading by machines has replaced much of human decisionmaking. People submit buy and sell orders, and computers match them in the blink of an eye without human intervention. Machines can spot trading inefficiencies or market differentials on a very small scale and execute trades that make money according to investor instructions. 12 Powered in some places by advanced computing, these tools have much greater capacities for storing information because of their emphasis not on a zero or a one, but on “quantum bits” that can store multiple values in each location. 13 That dramatically increases storage capacity and decreases processing times.

Fraud detection represents another way AI is helpful in financial systems. It sometimes is difficult to discern fraudulent activities in large organizations, but AI can identify abnormalities, outliers, or deviant cases requiring additional investigation. That helps managers find problems early in the cycle, before they reach dangerous levels. 14
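
A minimal flavor of that outlier flagging in Python, using a robust median-based rule (a toy; real fraud systems use adaptive, multi-feature models):

```python
import statistics

def flag_outliers(amounts, threshold=3.5):
    """Flag values far from the median, scaled by the median absolute deviation."""
    med = statistics.median(amounts)
    mad = statistics.median(abs(x - med) for x in amounts)
    return [x for x in amounts if abs(x - med) > threshold * mad]

payments = [102, 98, 110, 95, 105, 99, 4750]  # one suspicious spike
print(flag_outliers(payments))  # -> [4750]
```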

National security

AI plays a substantial role in national defense. Through its Project Maven, the American military is deploying AI “to sift through the massive troves of data and video captured by surveillance and then alert human analysts of patterns or when there is abnormal or suspicious activity.” 15 According to Deputy Secretary of Defense Patrick Shanahan, the goal of emerging technologies in this area is “to meet our warfighters’ needs and to increase [the] speed and agility [of] technology development and procurement.” 16

Artificial intelligence will accelerate the traditional process of warfare so rapidly that a new term has been coined: hyperwar.

The big data analytics associated with AI will profoundly affect intelligence analysis, as massive amounts of data are sifted in near real time—if not eventually in real time—thereby providing commanders and their staffs a level of intelligence analysis and productivity heretofore unseen. Command and control will similarly be affected as human commanders delegate certain routine, and in special circumstances, key decisions to AI platforms, reducing dramatically the time associated with the decision and subsequent action. In the end, warfare is a time competitive process, where the side able to decide the fastest and move most quickly to execution will generally prevail. Indeed, artificially intelligent intelligence systems, tied to AI-assisted command and control systems, can move decision support and decisionmaking to a speed vastly superior to the speeds of the traditional means of waging war. So fast will be this process, especially if coupled to automatic decisions to launch artificially intelligent autonomous weapons systems capable of lethal outcomes, that a new term has been coined specifically to embrace the speed at which war will be waged: hyperwar.

While the ethical and legal debate is raging over whether America will ever wage war with artificially intelligent autonomous lethal systems, the Chinese and Russians are not nearly so mired in this debate, and we should anticipate our need to defend against these systems operating at hyperwar speeds. The challenge in the West of where to position “humans in the loop” in a hyperwar scenario will ultimately dictate the West’s capacity to be competitive in this new form of conflict. 17

Just as AI will profoundly affect the speed of warfare, the proliferation of zero day or zero second cyber threats as well as polymorphic malware will challenge even the most sophisticated signature-based cyber protection. This forces significant improvement to existing cyber defenses. Increasingly, vulnerable systems are migrating, and will need to shift to a layered approach to cybersecurity with cloud-based, cognitive AI platforms. This approach moves the community toward a “thinking” defensive capability that can defend networks through constant training on known threats. This capability includes DNA-level analysis of heretofore unknown code, with the possibility of recognizing and stopping inbound malicious code by recognizing a string component of the file. This is how certain key U.S.-based systems stopped the debilitating “WannaCry” and “Petya” viruses.

Preparing for hyperwar and defending critical cyber networks must become a high priority because China, Russia, North Korea, and other countries are putting substantial resources into AI. In 2017, China’s State Council issued a plan for the country to “build a domestic industry worth almost $150 billion” by 2030. 18 As an example of the possibilities, the Chinese search firm Baidu has pioneered a facial recognition application that finds missing people. In addition, cities such as Shenzhen are providing up to $1 million to support AI labs. That country hopes AI will provide security, combat terrorism, and improve speech recognition programs. 19 The dual-use nature of many AI algorithms will mean AI research focused on one sector of society can be rapidly modified for use in the security sector as well. 20

Health care

AI tools are helping designers improve computational sophistication in health care. For example, Merantix is a German company that applies deep learning to medical issues. It has an application in medical imaging that “detects lymph nodes in the human body in Computer Tomography (CT) images.” 21 According to its developers, the key is labeling the nodes and identifying small lesions or growths that could be problematic. Humans can do this, but radiologists charge $100 per hour and may be able to carefully read only four images an hour. If there were 10,000 images, the cost of this process would be $250,000, which is prohibitively expensive if done by humans.

What deep learning can do in this situation is train computers on data sets to learn what a normal-looking versus an irregular-appearing lymph node is. After doing that through imaging exercises and honing the accuracy of the labeling, radiological imaging specialists can apply this knowledge to actual patients and determine the extent to which someone is at risk of cancerous lymph nodes. Since only a few are likely to test positive, it is a matter of identifying the unhealthy versus healthy node.
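
A compressed sketch of that train-then-apply workflow in Python with PyTorch. Everything here is a stand-in: random tensors play the role of labeled image features, and the tiny network is nothing like a production medical-imaging model, which would involve DICOM preprocessing, convolutional architectures, and clinical validation.

```python
import torch
from torch import nn

# Stand-in data: 500 scans, 256 features each; label 1 = irregular node
X = torch.randn(500, 256)
y = (X[:, 0] > 0.5).float().unsqueeze(1)  # synthetic ground truth

model = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 1))
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# "Training": hone labeling accuracy on examples with known answers
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# "Application": score a new, unseen scan's features
risk = torch.sigmoid(model(torch.randn(1, 256))).item()
print(f"estimated probability of an irregular node: {risk:.2f}")
```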

AI has been applied to congestive heart failure as well, an illness that afflicts 10 percent of senior citizens and costs $35 billion each year in the United States. AI tools are helpful because they “predict in advance potential challenges ahead and allocate resources to patient education, sensing, and proactive interventions that keep patients out of the hospital.” 22

Criminal justice

AI is being deployed in the criminal justice area. The city of Chicago has developed an AI-driven “Strategic Subject List” that analyzes people who have been arrested for their risk of becoming future perpetrators. It ranks 400,000 people on a scale of 0 to 500, using items such as age, criminal activity, victimization, drug arrest records, and gang affiliation. In looking at the data, analysts found that youth is a strong predictor of violence, being a shooting victim is associated with becoming a future perpetrator, gang affiliation has little predictive value, and drug arrests are not significantly associated with future criminal activity. 23

Judicial experts claim AI programs reduce human bias in law enforcement and lead to a fairer sentencing system. R Street Institute Associate Caleb Watney writes:

Empirically grounded questions of predictive risk analysis play to the strengths of machine learning, automated reasoning and other forms of AI. One machine-learning policy simulation concluded that such programs could be used to cut crime up to 24.8 percent with no change in jailing rates, or reduce jail populations by up to 42 percent with no increase in crime rates. 24

However, critics worry that AI algorithms represent “a secret system to punish citizens for crimes they haven’t yet committed. The risk scores have been used numerous times to guide large-scale roundups.” 25 The fear is that such tools target people of color unfairly and have not helped Chicago reduce the murder wave that has plagued it in recent years.

Despite these concerns, other countries are moving ahead with rapid deployment in this area. In China, for example, companies already have “considerable resources and access to voices, faces and other biometric data in vast quantities, which would help them develop their technologies.” 26 New technologies make it possible to match images and voices with other types of information, and to use AI on these combined data sets to improve law enforcement and national security. Through its “Sharp Eyes” program, Chinese law enforcement is matching video images, social media activity, online purchases, travel records, and personal identity into a “police cloud.” This integrated database enables authorities to keep track of criminals, potential law-breakers, and terrorists. 27 Put differently, China has become the world’s leading AI-powered surveillance state.

Transportation

Transportation represents an area where AI and machine learning are producing major innovations. Research by Cameron Kerry and Jack Karsten of the Brookings Institution has found that over $80 billion was invested in autonomous vehicle technology between August 2014 and June 2017. Those investments include applications both for autonomous driving and the core technologies vital to that sector. 28

Autonomous vehicles—cars, trucks, buses, and drone delivery systems—use advanced technological capabilities. Those features include automated vehicle guidance and braking, lane-changing systems, the use of cameras and sensors for collision avoidance, the use of AI to analyze information in real time, and the use of high-performance computing and deep learning systems to adapt to new circumstances through detailed maps. 29

Light detection and ranging systems (LIDARs) and AI are key to navigation and collision avoidance. LIDAR systems combine light and radar instruments. They are mounted on top of vehicles and use imaging in a 360-degree environment, with radar and light beams measuring the speed and distance of surrounding objects. Along with sensors placed on the front, sides, and back of the vehicle, these instruments provide information that keeps fast-moving cars and trucks in their own lane, helps them avoid other vehicles, applies brakes and steering when needed, and does so instantly so as to avoid accidents.


Since these cameras and sensors compile a huge amount of information and need to process it instantly to avoid the car in the next lane, autonomous vehicles require high-performance computing, advanced algorithms, and deep learning systems to adapt to new scenarios. This means that software is the key, not the physical car or truck itself. 30 Advanced software enables cars to learn from the experiences of other vehicles on the road and adjust their guidance systems as weather, driving, or road conditions change. 31

Ride-sharing companies are very interested in autonomous vehicles. They see advantages in terms of customer service and labor productivity. All of the major ride-sharing companies are exploring driverless cars. The surge of car-sharing and taxi services—such as Uber and Lyft in the United States, Daimler’s Mytaxi and Hailo service in Great Britain, and Didi Chuxing in China—demonstrate the opportunities of this transportation option. Uber recently signed an agreement to purchase 24,000 autonomous cars from Volvo for its ride-sharing service. 32

However, the ride-sharing firm suffered a setback in March 2018 when one of its autonomous vehicles in Arizona hit and killed a pedestrian. Uber and several auto manufacturers immediately suspended testing and launched investigations into what went wrong and how the fatality could have occurred. 33 Both industry and consumers want reassurance that the technology is safe and able to deliver on its stated promises. Unless there are persuasive answers, this accident could slow AI advancements in the transportation sector.

Smart cities

Metropolitan governments are using AI to improve urban service delivery. For example, according to Kevin Desouza, Rashmi Krishnamurthy, and Gregory Dawson:

The Cincinnati Fire Department is using data analytics to optimize medical emergency responses. The new analytics system recommends to the dispatcher an appropriate response to a medical emergency call—whether a patient can be treated on-site or needs to be taken to the hospital—by taking into account several factors, such as the type of call, location, weather, and similar calls. 34

Since it fields 80,000 requests each year, Cincinnati officials are deploying this technology to prioritize responses and determine the best ways to handle emergencies. They see AI as a way to deal with large volumes of data and figure out efficient ways of responding to public requests. Rather than address service issues in an ad hoc manner, authorities are trying to be proactive in how they provide urban services.
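
Cincinnati has not published its model, so the following hypothetical Python sketch only caricatures the decision the report describes, scoring a call from the factors listed (call type, location by way of transport distance, weather). Every feature, weight, and threshold here is invented.

```python
# Hypothetical dispatch-triage sketch in the spirit of the Cincinnati example.
# The real system's features, weights, and model are not public; everything
# numeric here is invented.
CALL_TYPE_RISK = {"cardiac": 0.9, "fall": 0.4, "minor injury": 0.2}

def recommend_response(call_type: str, distance_to_hospital_km: float,
                       severe_weather: bool) -> str:
    risk = CALL_TYPE_RISK.get(call_type, 0.5)       # unknown call types score mid
    risk += 0.1 if severe_weather else 0.0          # weather complicates on-site care
    risk += min(distance_to_hospital_km / 50, 0.2)  # cap the distance contribution
    return "transport to hospital" if risk >= 0.6 else "treat on site"

print(recommend_response("cardiac", 12.0, severe_weather=False))
# -> transport to hospital
```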

Cincinnati is not alone. A number of metropolitan areas are adopting smart city applications that use AI to improve service delivery, environmental planning, resource management, energy utilization, and crime prevention, among other things. For its smart cities index, the magazine Fast Company ranked American locales and found Seattle, Boston, San Francisco, Washington, D.C., and New York City as the top adopters. Seattle, for example, has embraced sustainability and is using AI to manage energy usage and resource management. Boston has launched a “City Hall To Go” that makes sure underserved communities receive needed public services. It also has deployed “cameras and inductive loops to manage traffic and acoustic sensors to identify gun shots.” San Francisco has certified 203 buildings as meeting LEED sustainability standards. 35

Through these and other means, metropolitan areas are leading the country in the deployment of AI solutions. Indeed, according to a National League of Cities report, 66 percent of American cities are investing in smart city technology. Among the top applications noted in the report are “smart meters for utilities, intelligent traffic signals, e-governance applications, Wi-Fi kiosks, and radio frequency identification sensors in pavement.” 36

Policy, regulatory, and ethical issues

These examples from a variety of sectors demonstrate how AI is transforming many walks of human existence. The increasing penetration of AI and autonomous devices into many aspects of life is altering basic operations and decisionmaking within organizations, and improving efficiency and response times.

At the same time, though, these developments raise important policy, regulatory, and ethical issues. For example, how should we promote data access? How do we guard against biased or unfair data used in algorithms? What types of ethical principles are introduced through software programming, and how transparent should designers be about their choices? What about questions of legal liability in cases where algorithms cause harm? 37

Data access problems

The key to getting the most out of AI is having a “data-friendly ecosystem with unified standards and cross-platform sharing.” AI depends on data that can be analyzed in real time and brought to bear on concrete problems. Having data that are “accessible for exploration” in the research community is a prerequisite for successful AI development. 38

According to a McKinsey Global Institute study, nations that promote open data sources and data sharing are the ones most likely to see AI advances. In this regard, the United States has a substantial advantage over China. Global ratings on data openness show that the United States ranks eighth overall in the world, compared with 93rd for China. 39

But right now, the United States does not have a coherent national data strategy. There are few protocols for promoting research access or platforms that make it possible to gain new insights from proprietary data. It is not always clear who owns data or how much belongs in the public sphere. These uncertainties limit the innovation economy and act as a drag on academic research. In the following section, we outline ways to improve data access for researchers.

Biases in data and algorithms

In some instances, certain AI systems are thought to have enabled discriminatory or biased practices. 40 For example, Airbnb has been accused of having homeowners on its platform who discriminate against racial minorities. A research project undertaken by the Harvard Business School found that “Airbnb users with distinctly African American names were roughly 16 percent less likely to be accepted as guests than those with distinctly white names.” 41

Racial issues also come up with facial recognition software. Most such systems operate by comparing a person’s face to a range of faces in a large database. As pointed out by Joy Buolamwini of the Algorithmic Justice League, “If your facial recognition data contains mostly Caucasian faces, that’s what your program will learn to recognize.” 42 Unless the databases have access to diverse data, these programs perform poorly when attempting to recognize African-American or Asian-American features.
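
One concrete way to surface the failure mode Buolamwini describes is to report a recognizer’s accuracy per demographic group rather than in aggregate. A minimal sketch, assuming labeled evaluation records and treating the model itself as a black box:

```python
# Minimal bias-audit sketch: measure a recognizer's accuracy per demographic
# group instead of in aggregate. The records below are stand-ins; only the
# auditing logic is shown, and the model itself is treated as a black box.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group_label, predicted_id, true_id) tuples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += int(predicted == actual)
    return {group: hits[group] / totals[group] for group in totals}

results = [("group_a", "id1", "id1"), ("group_a", "id2", "id2"),
           ("group_b", "id3", "id9"), ("group_b", "id4", "id4")]
print(accuracy_by_group(results))  # {'group_a': 1.0, 'group_b': 0.5}
```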

Many historical data sets reflect traditional values, which may or may not represent the preferences wanted in a current system. As Buolamwini notes, such an approach risks repeating inequities of the past:

The rise of automation and the increased reliance on algorithms for high-stakes decisions such as whether someone get insurance or not, your likelihood to default on a loan or somebody’s risk of recidivism means this is something that needs to be addressed. Even admissions decisions are increasingly automated—what school our children go to and what opportunities they have. We don’t have to bring the structural inequalities of the past into the future we create. 43

AI ethics and transparency

Algorithms embed ethical considerations and value choices into program decisions. As such, these systems raise questions concerning the criteria used in automated decisionmaking. Some people want to have a better understanding of how algorithms function and what choices are being made. 44

In the United States, many urban schools use algorithms for enrollment decisions based on a variety of considerations, such as parent preferences, neighborhood qualities, income level, and demographic background. According to Brookings researcher Jon Valant, the New Orleans–based Bricolage Academy “gives priority to economically disadvantaged applicants for up to 33 percent of available seats. In practice, though, most cities have opted for categories that prioritize siblings of current students, children of school employees, and families that live in school’s broad geographic area.” 45 Enrollment choices can be expected to be very different when considerations of this sort come into play.
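
To make the stakes of those priority categories concrete, here is a small, hypothetical sketch of a priority-tiered enrollment lottery. The tiers, applicant fields, and seat count are invented and only loosely mirror the policies described above.

```python
# Hypothetical sketch of a priority-tiered enrollment lottery. Tiers, fields,
# and seat counts are invented; they only loosely mirror the policies above.
import random

def assign_seats(applicants, seats, priority_key):
    """applicants: list of dicts. priority_key maps an applicant to a tier
    number (lower = higher priority). Ties are broken by lottery."""
    pool = applicants[:]
    random.shuffle(pool)         # lottery order within each tier
    pool.sort(key=priority_key)  # stable sort preserves the shuffle within tiers
    return pool[:seats]

applicants = [
    {"name": "A", "sibling": True,  "disadvantaged": False},
    {"name": "B", "sibling": False, "disadvantaged": True},
    {"name": "C", "sibling": False, "disadvantaged": False},
]

# One possible policy: economically disadvantaged applicants first.
by_need = lambda a: 0 if a["disadvantaged"] else 1
print([a["name"] for a in assign_seats(applicants, seats=2, priority_key=by_need)])
```

Swapping in a different priority_key, say one that puts siblings of current students first, changes which applicants receive seats, which is exactly why the choice of categories matters.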

Depending on how AI systems are set up, they can facilitate the redlining of mortgage applications, help people discriminate against individuals they don’t like, or help screen or build rosters of individuals based on unfair criteria. The types of considerations that go into programming decisions matter a lot in terms of how the systems operate and how they affect customers. 46

For these reasons, the EU is implementing the General Data Protection Regulation (GDPR) in May 2018. The rules specify that people have “the right to opt out of personally tailored ads” and “can contest ‘legal or similarly significant’ decisions made by algorithms and appeal for human intervention” in the form of an explanation of how the algorithm generated a particular outcome. Each guideline is designed to ensure the protection of personal data and provide individuals with information on how the “black box” operates. 47

Legal liability

There are questions concerning the legal liability of AI systems. If there are harms or infractions (or fatalities in the case of driverless cars), the operators of the algorithm likely will fall under product liability rules. A body of case law has shown that the situation’s facts and circumstances determine liability and influence the kind of penalties that are imposed. Those can range from civil fines to imprisonment for major harms. 48 The Uber-related fatality in Arizona will be an important test case for legal liability. The state actively recruited Uber to test its autonomous vehicles and gave the company considerable latitude in terms of road testing. It remains to be seen if there will be lawsuits in this case and who is sued: the human backup driver, the state of Arizona, the Phoenix suburb where the accident took place, Uber, software developers, or the auto manufacturer. Given the multiple people and organizations involved in the road testing, there are many legal questions to be resolved.

In non-transportation areas, digital platforms often have limited liability for what happens on their sites. For example, in the case of Airbnb, the firm “requires that people agree to waive their right to sue, or to join in any class-action lawsuit or class-action arbitration, to use the service.” By demanding that its users sacrifice basic rights, the company limits consumer protections and therefore curtails the ability of people to fight discrimination arising from unfair algorithms. 49 But whether this principle of platform neutrality will hold up across many sectors remains to be determined.

Recommendations

In order to balance innovation with basic human values, we propose a number of recommendations for moving forward with AI. This includes improving data access, increasing government investment in AI, promoting AI workforce development, creating a federal advisory committee, engaging with state and local officials to ensure they enact effective policies, regulating broad objectives as opposed to specific algorithms, taking bias seriously as an AI issue, maintaining mechanisms for human control and oversight, and penalizing malicious behavior and promoting cybersecurity.

Improving data access

The United States should develop a data strategy that promotes innovation and consumer protection. Right now, there are no uniform standards in terms of data access, data sharing, or data protection. Almost all the data are proprietary in nature and not shared very broadly with the research community, and this limits innovation and system design. AI requires data to test and improve its learning capacity. 50 Without structured and unstructured data sets, it will be nearly impossible to gain the full benefits of artificial intelligence.

In general, the research community needs better access to government and business data, although with appropriate safeguards to make sure researchers do not misuse data in the way Cambridge Analytica did with Facebook information. There are a variety of ways researchers could gain data access. One is through voluntary agreements with companies holding proprietary data. Facebook, for example, recently announced a partnership with Stanford economist Raj Chetty to use its social media data to explore inequality. 51 As part of the arrangement, researchers were required to undergo background checks and could only access data from secured sites in order to protect user privacy and security.

Google long has made available search results in aggregated form for researchers and the general public. Through its “Trends” site, scholars can analyze topics such as interest in Trump, views about democracy, and perspectives on the overall economy. 52 That helps people track movements in public interest and identify topics that galvanize the general public.

Twitter makes much of its tweets available to researchers through application programming interfaces, commonly referred to as APIs. These tools help people outside the company build application software and make use of data from its social media platform. They can study patterns of social media communications and see how people are commenting on or reacting to current events.

In some sectors where there is a discernible public benefit, governments can facilitate collaboration by building infrastructure that shares data. For example, the National Cancer Institute has pioneered a data-sharing protocol through which certified researchers can query the health data it holds, using de-identified information drawn from clinical data, claims information, and drug therapies. That enables researchers to evaluate efficacy and effectiveness, and to make recommendations regarding the best medical approaches, without compromising the privacy of individual patients.

There could be public-private data partnerships that combine government and business data sets to improve system performance. For example, cities could integrate information from ride-sharing services with their own material on social service locations, bus lines, mass transit, and highway congestion to improve transportation. That would help metropolitan areas deal with traffic tie-ups and assist in highway and mass transit planning.

Some combination of these approaches would improve data access for researchers, the government, and the business community, without impinging on personal privacy. As noted by Ian Buck, the vice president of NVIDIA, “Data is the fuel that drives the AI engine. The federal government has access to vast sources of information. Opening access to that data will help us get insights that will transform the U.S. economy.” 53 Through its Data.gov portal, the federal government already has put over 230,000 data sets into the public domain, and this has propelled innovation and aided improvements in AI and data analytic technologies. 54 The private sector also needs to facilitate research data access so that society can achieve the full benefits of artificial intelligence.
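
The Data.gov catalog is also searchable programmatically. The sketch below queries what is, to our knowledge, its CKAN-style search API; the endpoint shape and response fields are asserted from memory and should be verified against the current Data.gov documentation before being relied on.

```python
# Sketch: search the Data.gov catalog for data sets on a topic.
# catalog.data.gov exposes a CKAN-style API; verify the endpoint and the
# response shape against current documentation, as both are hedged here.
import json
import urllib.parse
import urllib.request

def search_datasets(query: str, rows: int = 5) -> list[str]:
    url = ("https://catalog.data.gov/api/3/action/package_search"
           "?q=" + urllib.parse.quote(query) + "&rows=" + str(rows))
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)               # CKAN wraps hits in "result"
    return [d["title"] for d in payload["result"]["results"]]

print(search_datasets("traffic"))               # e.g., five traffic data sets
```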

Increase government investment in AI

According to Greg Brockman, the co-founder of OpenAI, the U.S. federal government invests only $1.1 billion in non-classified AI technology. 55 That is far lower than the amount being spent by China or other leading nations in this area of research. That shortfall is noteworthy because the economic payoffs of AI are substantial. In order to boost economic development and social innovation, federal officials need to increase investment in artificial intelligence and data analytics. Higher investment is likely to pay for itself many times over in economic and social benefits. 56

Promote digital education and workforce development

As AI applications accelerate across many sectors, it is vital that we reimagine our educational institutions for a world where AI will be ubiquitous and students need a different kind of training than they currently receive. Right now, many students do not receive instruction in the kinds of skills that will be needed in an AI-dominated landscape. For example, there currently are shortages of data scientists, computer scientists, engineers, coders, and platform developers, and unless our educational system generates more people with these capabilities, the shortfall will limit AI development.

For these reasons, both state and federal governments have been investing in AI human capital. For example, in 2017, the National Science Foundation funded over 6,500 graduate students in computer-related fields and has launched several new initiatives designed to encourage data and computer science at all levels from pre-K to higher and continuing education. 57 The goal is to build a larger pipeline of AI and data analytic personnel so that the United States can reap the full advantages of the knowledge revolution.

But there also needs to be substantial changes in the process of learning itself. It is not just technical skills that are needed in an AI world but skills of critical reasoning, collaboration, design, visual display of information, and independent thinking, among others. AI will reconfigure how society and the economy operate, and there needs to be “big picture” thinking on what this will mean for ethics, governance, and societal impact. People will need the ability to think broadly about many questions and integrate knowledge from a number of different areas.

One example of new ways to prepare students for a digital future is IBM’s Teacher Advisor program, which uses Watson’s free online tools to help teachers bring the latest knowledge into the classroom. The tools enable instructors to develop new lesson plans in STEM and non-STEM fields, find relevant instructional videos, and help students get the most out of the classroom. 58 As such, they are precursors of the new educational environments that need to be created.

Create a federal AI advisory committee

Federal officials need to think about how they deal with artificial intelligence. As noted previously, there are many issues ranging from the need for improved data access to addressing issues of bias and discrimination. It is vital that these and other concerns be considered so we gain the full benefits of this emerging technology.

In order to move forward in this area, several members of Congress have introduced the “Future of Artificial Intelligence Act,” a bill designed to establish broad policy and legal principles for AI. It proposes the secretary of commerce create a federal advisory committee on the development and implementation of artificial intelligence. The legislation provides a mechanism for the federal government to get advice on ways to promote a “climate of investment and innovation to ensure the global competitiveness of the United States,” “optimize the development of artificial intelligence to address the potential growth, restructuring, or other changes in the United States workforce,” “support the unbiased development and application of artificial intelligence,” and “protect the privacy rights of individuals.” 59

The specific questions the committee is asked to address include the following: competitiveness, workforce impact, education, ethics training, data sharing, international cooperation, accountability, machine learning bias, rural impact, government efficiency, investment climate, job impact, bias, and consumer impact. The committee is directed to submit a report to Congress and the administration 540 days after enactment regarding any legislative or administrative action needed on AI.

This legislation is a step in the right direction, although the field is moving so rapidly that we would recommend shortening the reporting timeline from 540 days to 180 days. Waiting nearly two years for a committee report will certainly result in missed opportunities and a lack of action on important issues. Given rapid advances in the field, having a much quicker turnaround time on the committee analysis would be quite beneficial.

Engage with state and local officials

States and localities also are taking action on AI. For example, the New York City Council unanimously passed a bill that directed the mayor to form a taskforce that would “monitor the fairness and validity of algorithms used by municipal agencies.” 60 The city employs algorithms to “determine if a lower bail will be assigned to an indigent defendant, where firehouses are established, student placement for public schools, assessing teacher performance, identifying Medicaid fraud and determine where crime will happen next.” 61

According to the legislation’s developers, city officials want to know how these algorithms work and make sure there is sufficient AI transparency and accountability. In addition, there is concern regarding the fairness and biases of AI algorithms, so the taskforce has been directed to analyze these issues and make recommendations regarding future usage. It is scheduled to report back to the mayor on a range of AI policy, legal, and regulatory issues by late 2019.

Some observers already are worrying that the taskforce won’t go far enough in holding algorithms accountable. For example, Julia Powles of Cornell Tech and New York University argues that the bill originally required companies to make the AI source code available to the public for inspection, and that there be simulations of its decisionmaking using actual data. After criticism of those provisions, however, former Councilman James Vacca dropped the requirements in favor of a task force studying these issues. He and other city officials were concerned that publication of proprietary information on algorithms would slow innovation and make it difficult to find AI vendors who would work with the city. 62 It remains to be seen how this local task force will balance issues of innovation, privacy, and transparency.

Regulate broad objectives more than specific algorithms

The European Union has taken a restrictive stance on these issues of data collection and analysis. 63 It has rules limiting the ability of companies to collect data on road conditions and to map street views. Because many of these countries worry that people’s personal information in unencrypted Wi-Fi networks is swept up in overall data collection, the EU has fined technology firms, demanded copies of data, and placed limits on the material collected. 64 This has made it more difficult for technology companies operating there to develop the high-definition maps required for autonomous vehicles.

The GDPR being implemented in Europe places severe restrictions on the use of artificial intelligence and machine learning. According to published guidelines, “Regulations prohibit any automated decision that ‘significantly affects’ EU citizens. This includes techniques that evaluates a person’s ‘performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.’” 65 In addition, these new rules give citizens the right to review how digital services made specific algorithmic choices affecting them.
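
In engineering terms, the rule amounts to a gate in front of the model: automated outcomes in “significant” domains for EU data subjects are routed to a human instead of taking effect directly. A minimal sketch, with the domain list and review mechanism invented for illustration:

```python
# Illustration of the GDPR constraint described above: automated decisions
# that "significantly affect" an EU data subject are queued for human review
# rather than applied directly. Domains and the review hook are invented.
SIGNIFICANT_DOMAINS = {"credit", "employment", "insurance", "health"}

def apply_decision(domain: str, automated_outcome: str, subject_in_eu: bool) -> str:
    if subject_in_eu and domain in SIGNIFICANT_DOMAINS:
        return f"queued for human review (model suggested: {automated_outcome})"
    return automated_outcome

print(apply_decision("credit", "deny loan", subject_in_eu=True))
# -> queued for human review (model suggested: deny loan)
```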

If interpreted stringently, these rules will make it difficult for European software designers (and American designers who work with European counterparts) to incorporate artificial intelligence and high-definition mapping in autonomous vehicles. Central to navigation in these cars and trucks is tracking location and movements. Without high-definition maps containing geo-coded data and the deep learning that makes use of this information, fully autonomous driving will stagnate in Europe. Through this and other data protection actions, the European Union is putting its manufacturers and software designers at a significant disadvantage to the rest of the world.

It makes more sense to think about the broad objectives desired in AI and enact policies that advance them, as opposed to governments trying to crack open the “black boxes” and see exactly how specific algorithms operate. Regulating individual algorithms will limit innovation and make it difficult for companies to make use of artificial intelligence.

Take biases seriously

Bias and discrimination are serious issues for AI. There already have been a number of cases of unfair treatment linked to historic data, and steps need to be undertaken to make sure that does not become prevalent in artificial intelligence. Existing statutes governing discrimination in the physical economy need to be extended to digital platforms. That will help protect consumers and build confidence in these systems as a whole.

For these advances to be widely adopted, more transparency is needed in how AI systems operate. Andrew Burt of Immuta argues, “The key problem confronting predictive analytics is really transparency. We’re in a world where data science operations are taking on increasingly important tasks, and the only thing holding them back is going to be how well the data scientists who train the models can explain what it is their models are doing.” 66

Maintaining mechanisms for human oversight and control

Some individuals have argued that there needs to be avenues for humans to exercise oversight and control of AI systems. For example, Allen Institute for Artificial Intelligence CEO Oren Etzioni argues there should be rules for regulating these systems. First, he says, AI must be governed by all the laws that already have been developed for human behavior, including regulations concerning “cyberbullying, stock manipulation or terrorist threats,” as well as “entrap[ping] people into committing crimes.” Second, he believes that these systems should disclose they are automated systems and not human beings. Third, he states, “An A.I. system cannot retain or disclose confidential information without explicit approval from the source of that information.” 67 His rationale is that these tools store so much data that people have to be cognizant of the privacy risks posed by AI.

In the same vein, the IEEE Global Initiative has ethical guidelines for AI and autonomous systems. Its experts suggest that these models be programmed with consideration for widely accepted human norms and rules for behavior. AI algorithms need to take into account the importance of these norms, how norm conflict can be resolved, and ways these systems can be transparent about norm resolution. Software designs should be programmed for “nondeception” and “honesty,” according to ethics experts. When failures occur, there must be mitigation mechanisms to deal with the consequences. In particular, AI must be sensitive to problems such as bias, discrimination, and fairness. 68

A group of machine learning experts claim it is possible to automate ethical decisionmaking. Using the trolley problem as a moral dilemma, they ask the following question: If an autonomous car goes out of control, should it be programmed to kill its own passengers or the pedestrians who are crossing the street? They devised a “voting-based system” that asked 1.3 million people to assess alternative scenarios, summarized the overall choices, and applied the overall perspective of these individuals to a range of vehicular possibilities. That allowed them to automate ethical decisionmaking in AI algorithms, taking public preferences into account. 69 This procedure, of course, does not reduce the tragedy involved in any kind of fatality, such as seen in the Uber case, but it provides a mechanism to help AI developers incorporate ethical considerations in their planning.
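
The published system learns per-voter preference models before aggregating them; the toy sketch below shows only the simplest version of the aggregation idea, a per-scenario majority vote.

```python
# Toy aggregation step of a "voting-based system": tally many people's
# choices per dilemma scenario and keep the majority preference. The actual
# Noothigattu et al. system learns voter preference models; this shows only
# the simplest aggregation idea.
from collections import Counter

def aggregate_votes(votes):
    """votes: iterable of (scenario_id, chosen_action) pairs."""
    tallies = {}
    for scenario, action in votes:
        tallies.setdefault(scenario, Counter())[action] += 1
    return {s: counts.most_common(1)[0][0] for s, counts in tallies.items()}

votes = [("swerve_vs_brake", "brake"), ("swerve_vs_brake", "brake"),
         ("swerve_vs_brake", "swerve")]
print(aggregate_votes(votes))  # {'swerve_vs_brake': 'brake'}
```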

Penalize malicious behavior and promote cybersecurity

As with any emerging technology, it is important to discourage malicious treatment designed to trick software or use it for undesirable ends. 70 This is especially important given the dual-use aspects of AI, where the same tool can be used for beneficial or malicious purposes. The malevolent use of AI exposes individuals and organizations to unnecessary risks and undermines the virtues of the emerging technology. This includes behaviors such as hacking, manipulating algorithms, compromising privacy and confidentiality, or stealing identities. Efforts to hijack AI in order to solicit confidential information should be seriously penalized as a way to deter such actions. 71

In a rapidly changing world with many entities having advanced computing capabilities, there needs to be serious attention devoted to cybersecurity. Countries have to be careful to safeguard their own systems and keep other nations from damaging their security. 72 According to the U.S. Department of Homeland Security, a major American bank receives around 11 million calls a week at its service center. In order to protect its telephony from denial of service attacks, it uses a “machine learning-based policy engine [that] blocks more than 120,000 calls per month based on voice firewall policies including harassing callers, robocalls and potential fraudulent calls.” 73 This represents a way in which machine learning can help defend technology systems from malevolent attacks.
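
The DHS description suggests a layered filter: static policies combined with a model score. Below is a hedged sketch of what such a policy engine’s decision logic might look like; the blocklist, rate threshold, and spoof score are stand-ins, not the bank’s actual policies.

```python
# Hedged sketch of a call-screening "policy engine," loosely inspired by the
# DHS example above. The real policies and ML model are not public; the
# blocklist, rate limit, and spoof-score threshold here are invented.
BLOCKLIST = {"+15550100", "+15550101"}   # known harassing callers (made up)
MAX_CALLS_PER_HOUR = 20                  # crude robocall heuristic

def should_block(caller_id: str, calls_last_hour: int, spoof_score: float) -> bool:
    """spoof_score: 0..1 output of some voice-analysis model (stand-in)."""
    if caller_id in BLOCKLIST:
        return True
    if calls_last_hour > MAX_CALLS_PER_HOUR:   # likely robocaller
        return True
    return spoof_score > 0.8                   # likely spoofed/fraudulent call

print(should_block("+15550199", calls_last_hour=3, spoof_score=0.9))  # True
```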

To summarize, the world is on the cusp of revolutionizing many sectors through artificial intelligence and data analytics. There already are significant deployments in finance, national security, health care, criminal justice, transportation, and smart cities that have altered decisionmaking, business models, risk mitigation, and system performance. These developments are generating substantial economic and social benefits.

Yet the manner in which AI systems unfold has major implications for society as a whole. It matters how policy issues are addressed, ethical conflicts are reconciled, legal realities are resolved, and how much transparency is required in AI and data analytic solutions. 74 Human choices about software development affect the way in which decisions are made and the manner in which they are integrated into organizational routines. Exactly how these processes are executed needs to be better understood because they will have a substantial impact on the general public soon, and for the foreseeable future. AI may well be a revolution in human affairs, and become the single most influential human innovation in history.

Note: We appreciate the research assistance of Grace Gilberg, Jack Karsten, Hillary Schaub, and Kristjan Tomasson on this project.

The Brookings Institution is a nonprofit organization devoted to independent research and policy solutions. Its mission is to conduct high-quality, independent research and, based on that research, to provide innovative, practical recommendations for policymakers and the public. The conclusions and recommendations of any Brookings publication are solely those of its author(s), and do not reflect the views of the Institution, its management, or its other scholars.

Support for this publication was generously provided by Amazon. Brookings recognizes that the value it provides is in its absolute commitment to quality, independence, and impact. Activities supported by its donors reflect this commitment. 

John R. Allen is a member of the Board of Advisors of Amida Technology and on the Board of Directors of Spark Cognition. Both companies work in fields discussed in this piece.

  • Thomas Davenport, Jeff Loucks, and David Schatsky, “Bullish on the Business Value of Cognitive” (Deloitte, 2017), p. 3 (www2.deloitte.com/us/en/pages/deloitte-analytics/articles/cognitive-technology-adoption-survey.html).
  • Luke Dormehl, Thinking Machines: The Quest for Artificial Intelligence—and Where It’s Taking Us Next (New York: Penguin–TarcherPerigee, 2017).
  • Shubhendu and Vijay, “Applicability of Artificial Intelligence in Different Fields of Life.”
  • Andrew McAfee and Erik Brynjolfsson, Machine Platform Crowd: Harnessing Our Digital Future (New York: Norton, 2017).
  • Portions of this paper draw on Darrell M. West, The Future of Work: Robots, AI, and Automation, Brookings Institution Press, 2018.
  • PriceWaterhouseCoopers, “Sizing the Prize: What’s the Real Value of AI for Your Business and How Can You Capitalise?” 2017.
  • Dominic Barton, Jonathan Woetzel, Jeongmin Seong, and Qinzheng Tian, “Artificial Intelligence: Implications for China” (New York: McKinsey Global Institute, April 2017), p. 1.
  • Nathaniel Popper, “Stocks and Bots,” New York Times Magazine, February 28, 2016.
  • Michael Lewis, Flash Boys: A Wall Street Revolt (New York: Norton, 2015).
  • Cade Metz, “In Quantum Computing Race, Yale Professors Battle Tech Giants,” New York Times, November 14, 2017, p. B3.
  • Executive Office of the President, “Artificial Intelligence, Automation, and the Economy,” December 2016, pp. 27-28.
  • Christian Davenport, “Future Wars May Depend as Much on Algorithms as on Ammunition, Report Says,” Washington Post, December 3, 2017.
  • John R. Allen and Amir Husain, “On Hyperwar,” Naval Institute Proceedings, July 17, 2017, pp. 30-36.
  • Paul Mozur, “China Sets Goal to Lead in Artificial Intelligence,” New York Times, July 21, 2017, p. B1.
  • Paul Mozur and John Markoff, “Is China Outsmarting American Artificial Intelligence?” New York Times, May 28, 2017.
  • Economist, “America v China: The Battle for Digital Supremacy,” March 15, 2018.
  • Rasmus Rothe, “Applying Deep Learning to Real-World Problems,” Medium, May 23, 2017.
  • Eric Horvitz, “Reflections on the Status and Future of Artificial Intelligence,” Testimony before the U.S. Senate Subcommittee on Space, Science, and Competitiveness, November 30, 2016, p. 5.
  • Jeff Asher and Rob Arthur, “Inside the Algorithm That Tries to Predict Gun Violence in Chicago,” New York Times Upshot, June 13, 2017.
  • Caleb Watney, “It’s Time for our Justice System to Embrace Artificial Intelligence,” TechTank (blog), Brookings Institution, July 20, 2017.
  • Asher and Arthur, “Inside the Algorithm That Tries to Predict Gun Violence in Chicago.”
  • Paul Mozur and Keith Bradsher, “China’s A.I. Advances Help Its Tech Industry, and State Security,” New York Times, December 3, 2017.
  • Simon Denyer, “China’s Watchful Eye,” Washington Post, January 7, 2018.
  • Cameron Kerry and Jack Karsten, “Gauging Investment in Self-Driving Cars,” Brookings Institution, October 16, 2017.
  • Portions of this section are drawn from Darrell M. West, “Driverless Cars in China, Europe, Japan, Korea, and the United States,” Brookings Institution, September 2016.
  • Yuming Ge, Xiaoman Liu, Libo Tang, and Darrell M. West, “Smart Transportation in China and the United States,” Center for Technology Innovation, Brookings Institution, December 2017.
  • Peter Holley, “Uber Signs Deal to Buy 24,000 Autonomous Vehicles from Volvo,” Washington Post, November 20, 2017.
  • Daisuke Wakabayashi, “Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam,” New York Times, March 19, 2018.
  • Kevin Desouza, Rashmi Krishnamurthy, and Gregory Dawson, “Learning from Public Sector Experimentation with Artificial Intelligence,” TechTank (blog), Brookings Institution, June 23, 2017.
  • Boyd Cohen, “The 10 Smartest Cities in North America,” Fast Company, November 14, 2013.
  • Teena Maddox, “66% of US Cities Are Investing in Smart City Technology,” TechRepublic, November 6, 2017.
  • Osonde Osoba and William Welser IV, “The Risks of Artificial Intelligence to Security and the Future of Work” (Santa Monica, Calif.: RAND Corp., December 2017) (www.rand.org/pubs/perspectives/PE237.html).
  • Ibid., p. 7.
  • Dominic Barton, Jonathan Woetzel, Jeongmin Seong, and Qinzheng Tian, “Artificial Intelligence: Implications for China” (New York: McKinsey Global Institute, April 2017), p. 7.
  • Executive Office of the President, “Preparing for the Future of Artificial Intelligence,” October 2016, pp. 30-31.
  • Elaine Glusac, “As Airbnb Grows, So Do Claims of Discrimination,” New York Times, June 21, 2016.
  • “Joy Buolamwini,” Bloomberg Businessweek, July 3, 2017, p. 80.
  • Mark Purdy and Paul Daugherty, “Why Artificial Intelligence is the Future of Growth,” Accenture, 2016.
  • Jon Valant, “Integrating Charter Schools and Choice-Based Education Systems,” Brown Center Chalkboard blog, Brookings Institution, June 23, 2017.
  • Tucker, “‘A White Mask Worked Better.’”
  • Cliff Kuang, “Can A.I. Be Taught to Explain Itself?” New York Times Magazine, November 21, 2017.
  • Yale Law School Information Society Project, “Governing Machine Learning,” September 2017.
  • Katie Benner, “Airbnb Vows to Fight Racism, But Its Users Can’t Sue to Prompt Fairness,” New York Times, June 19, 2016.
  • Executive Office of the President, “Artificial Intelligence, Automation, and the Economy” and “Preparing for the Future of Artificial Intelligence.”
  • Nancy Scola, “Facebook’s Next Project: American Inequality,” Politico, February 19, 2018.
  • Darrell M. West, “What Internet Search Data Reveals about Donald Trump’s First Year in Office,” Brookings Institution policy report, January 17, 2018.
  • Ian Buck, “Testimony before the House Committee on Oversight and Government Reform Subcommittee on Information Technology,” February 14, 2018.
  • Keith Nakasone, “Testimony before the House Committee on Oversight and Government Reform Subcommittee on Information Technology,” March 7, 2018.
  • Greg Brockman, “The Dawn of Artificial Intelligence,” Testimony before the U.S. Senate Subcommittee on Space, Science, and Competitiveness, November 30, 2016.
  • Amir Khosrowshahi, “Testimony before the House Committee on Oversight and Government Reform Subcommittee on Information Technology,” February 14, 2018.
  • James Kurose, “Testimony before the House Committee on Oversight and Government Reform Subcommittee on Information Technology,” March 7, 2018.
  • Stephen Noonoo, “Teachers Can Now Use IBM’s Watson to Search for Free Lesson Plans,” EdSurge, September 13, 2017.
  • Congress.gov, “H.R. 4625, FUTURE of Artificial Intelligence Act of 2017,” December 12, 2017.
  • Elizabeth Zima, “Could New York City’s AI Transparency Bill Be a Model for the Country?” Government Technology, January 4, 2018.
  • Julia Powles, “New York City’s Bold, Flawed Attempt to Make Algorithms Accountable,” New Yorker, December 20, 2017.
  • Sheera Frenkel, “Tech Giants Brace for Europe’s New Data Privacy Rules,” New York Times, January 28, 2018.
  • Claire Miller and Kevin O’Brien, “Germany’s Complicated Relationship with Google Street View,” New York Times, April 23, 2013.
  • Cade Metz, “Artificial Intelligence is Setting Up the Internet for a Huge Clash with Europe,” Wired, July 11, 2016.
  • Eric Siegel, “Predictive Analytics Interview Series: Andrew Burt,” Predictive Analytics Times, June 14, 2017.
  • Oren Etzioni, “How to Regulate Artificial Intelligence,” New York Times, September 1, 2017.
  • IEEE Global Initiative, “Ethical Considerations in Artificial Intelligence and Autonomous Systems,” unpublished paper, 2018.
  • Ritesh Noothigattu, Snehalkumar Gaikwad, Edmond Awad, Sohan Dsouza, Iyad Rahwan, Pradeep Ravikumar, and Ariel Procaccia, “A Voting-Based System for Ethical Decision Making,” Computers and Society, September 20, 2017 (www.media.mit.edu/publications/a-voting-based-system-for-ethical-decision-making/).
  • Miles Brundage, et al., “The Malicious Use of Artificial Intelligence,” University of Oxford unpublished paper, February 2018.
  • John Markoff, “As Artificial Intelligence Evolves, So Does Its Criminal Potential,” New York Times, October 24, 2016, p. B3.
  • Economist, “The Challenger: Technopolitics,” March 17, 2018.
  • Douglas Maughan, “Testimony before the House Committee on Oversight and Government Reform Subcommittee on Information Technology,” March 7, 2018.
  • Levi Tillemann and Colin McCormick, “Roadmapping a U.S.-German Agenda for Artificial Intelligence Policy,” New American Foundation, March 2017.

THE IMPACT OF TECHNOLOGY ON HIGHER EDUCATION IN THE 21ST CENTURY: A SYSTEMATIC LITERATURE REVIEW

Dr. Ahmad Shekib Popal

2024, GAP iNTERDISCIPLINARITIES: A Global Journal of Interdisciplinary Studies (ISSN 2581-5628), Impact Factor: SJIF 5.363, IIFS 4.875

In the ever-evolving landscape of 21st-century higher education, this article delves into the transformative role technology plays in reshaping how we acquire, disseminate, and apply knowledge. From the traditional chalkboards to interactive screens, the evolution has been revolutionary, woven into the fabric of our daily lives. The exploration draws on scholarly sources, navigating through digital tools, platforms, and strategies, from classrooms to online environments, and from augmented reality to artificial intelligence. The literature review assesses the remarkable transformation catalyzed by digital technologies, examining themes such as digital natives, blended learning, immersive technologies, adaptive learning, and data analytics. It uncovers both opportunities and challenges, addressing issues of equity and ethical considerations. The research questions focus on technology's impact on student engagement, learning outcomes, and equitable access. Objectives include elevating student digital literacy and enhancing teacher proficiency in online pedagogy. The methodology combines a comprehensive literature review with practical interventions and data analysis. The article concludes by emphasizing the dynamic nature of technology in education, acknowledging challenges, and calling for ongoing research and critical evaluation to shape the future of learning.

Related Papers

South African Computer Journal

Reuben Dlamini

Advances in Higher Education and Professional Development

Sheri Conklin

Today's society is driven by and involves technology. Many people cannot function without their cell phones, social media, gadgets, tablets, and the other forms of technology through which people interact. Many of these technologies depend upon and are utilized within an online context. Yet when it comes to online learning environments, many faculty struggle to develop and implement opportunities that build a sense of community for their learners. This chapter: 1) discusses key factors that impact student engagement, 2) addresses factors that facilitate continued engagement for diverse online learners, 3) provides evidence-based practices for creating and sustaining online learner engagement, and 4) offers real-world suggestions from the authors' own online teaching experience.

International Journal of Creative Research Thoughts (IJCRT)

Dr. Anamika Ahirwar , Mahendra Singh Panwar

Digital technology has become an indispensable component of modern education, revolutionizing the learning process in profound ways. This research paper provides an in-depth examination of the multifaceted role of digital technology in shaping contemporary learning environments. Through an extensive review of existing literature, this paper explores the impact of digital technology on student engagement, pedagogical practices, educational outcomes, and the overall learning experience. Additionally, it addresses the challenges and opportunities associated with the integration of digital technology in education, including issues such as access, equity, privacy, and security. By synthesizing current research findings and best practices, this paper aims to provide valuable insights into how digital technology can be effectively leveraged to enhance teaching and learning in the digital age.

Abdullah Saykili

The dominant role that digital connective technologies play in the 21st century is causing profound changes in all domains of life, signaling that we have reached a new age: the digital age. Education is one of the fundamental domains of life being re-engineered to adapt to the changing landscape of what it means to function in this new age. The school paradigm, which rests on the conditions and requirements of the industrial age, appears to fall short of meeting the needs and demands of the 21st-century learner. Emerging digital connective technologies and the educational innovations they have triggered, such as open educational resources (OER), massive open online courses (MOOCs), and learning analytics, are disrupting the learning processes and structures of the industrial age to the point that developing a new educational paradigm is now imperative. These innovations enable learners to extend learning beyond the boundaries of traditional institutions through informal and enriched learning experiences in online communities on social media and other social platforms. They also free learners from the constraints of time, so that learners can not only access but also create knowledge through social interaction and collaboration. The age we live in is ripe for unprecedented, fundamental changes and opportunities for higher education (HE). Policymakers in education therefore need to rethink the implications of digital connective technologies, and the challenges and opportunities they bring to the educational scene, while developing value-added policies for HE. This paper addresses the learner, instructor, learning environment, and administration dimensions of HE and how digital connective technologies affect each in the digital age. It concludes by offering a road map for HE to function better in this age.

Educational Administration: Theory and Practice

Jayanta Mete

Global educational systems face ongoing and increasing demands to incorporate contemporary communication and information technologies into their teaching methods in order to provide students with the necessary knowledge and skills for the 21st century. Computers and computer-mediated communication and information are becoming more and more integrated into educational curriculum development. Technological innovations are often seen as mere instruments used to augment the process of teaching and learning. The study attempts to identify the role of digital education in higher education and to examine the relationship between the two, using secondary data. After exploring multiple studies, the paper finds that digital education plays a significant role in student engagement, learning outcomes, and the overall educational experience. The review offers valuable insights for educators, policymakers, and institutions seeking to integrate digital education effectively and meet the evolving needs of higher education in the digital era.

Library Hi Tech News

Jutta Treviranus

International Journal of Educational Technology in Higher Education

Melissa Bond

Digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience. It has also been linked to an increase in behavioural, affective and cognitive student engagement, the facilitation of which is a central concern of educators. In order to delineate the complex nexus of technology and student engagement, this article systematically maps research from 243 studies published between 2007 and 2016. Research within the corpus was predominantly undertaken within the United States and the United Kingdom, with only limited research undertaken in the Global South, and largely focused on the fields of Arts & Humanities, Education, and Natural Sciences, Mathematics & Statistics. Studies most often used quantitative methods, followed by mixed methods, with little qualitative research methods employed. Few studies provided a definition of student engagement, and less than half were guided by a theoretical framework. The courses investigated used blended learning and text-based tools (e.g. discussion forums) most often, with undergraduate students as the primary target group. Stemming from the use of educational technology, behavioural engagement was by far the most often identified dimension, followed by affective and cognitive engagement. This mapping article provides the grounds for further exploration into discipline-specific use of technology to foster student engagement.

Educational Review

Gordon Mikoski

Nota Bene 2014: 20th Anniversary Phi Theta Kappa Honor Society Anthology

Lisa Haygood

The change of political party platforms ushered in with the Democratic victory of Barack Obama in 2008 resulted in a distinct shift in public educational efforts away from the “No Child Left Behind” standardization championed by the George W. Bush White House, refocusing attention on the American postsecondary education system and underscoring the common core belief that a college education should and would be the goal of every graduating high school senior. Online courses will continue to augment traditional curriculum offerings and provide more students with the flexibility to begin, enhance, and/or complete their degree. As with countless industries before it, postsecondary education is being transformed by technology, in and out of the traditional classroom. It is critical that lawmakers, public and private institutions, educators, private corporations, and entrepreneurs embrace the IT revolution in which higher education is already immersed, and strive to maintain the affordability of these courses through cooperative authorship and delivery. Early stumbles and hiccups have long since given way to a viable, affordable, and statistically successful adjunct to higher education in America and internationally. The US must maintain its leading role in the quest for refined online education standards of development and delivery in order to provide the opportunity of a quality education and a path to fulfilling the American Dream.

Studia paedagogica

Eliana Esther Gallardo Echenique

How Technology Is Changing the Future of Higher Education

Labs test artificial intelligence, virtual reality and other innovations that could improve learning and lower costs for Generation Z and beyond.

By Jon Marcus

This article is part of our latest Learning special report. We’re focusing on Generation Z, which is facing challenges from changing curriculums and new technology to financial aid gaps and homelessness.

MANCHESTER, N.H. — Cruising to class in her driverless car, a student crams from notes projected on the inside of the windshield while she gestures with her hands to shape a 3-D holographic model of her architecture project.

It looks like science fiction, an impression reinforced by the fact that it is being demonstrated in virtual reality in an ultramodern space with overstuffed pillows for seats. But this scenario is based on technology already in development.

The setting is the Sandbox ColLABorative, the innovation arm of Southern New Hampshire University, on the fifth floor of a downtown building with panoramic views of the sprawling red brick mills that date from this city’s 19th-century industrial heyday.

It is one of a small but growing number of places where experts are testing new ideas that will shape the future of a college education, using everything from blockchain networks to computer simulations to artificial intelligence, or A.I.

Theirs is not a future of falling enrollment, financial challenges and closing campuses. It’s a brighter world in which students subscribe to rather than enroll in college, learn languages in virtual reality foreign streetscapes with avatars for conversation partners, have their questions answered day or night by A.I. teaching assistants and control their own digital transcripts that record every life achievement.

21st Century Communication Technology Essay

Changing communication technology and the presence of the internet have greatly affected the way firms conduct business. It is now possible to run a business using largely virtual resources while still earning reasonable revenue and profit from operations with minimal investment. Communications technology has also dramatically changed the way people in a company interact and communicate with each other for business as well as personal purposes.

The most common forms of communication used in a company over time have been face-to-face conversation, memos, letters, bulletin boards, and financial reports. The choice of medium depends on the purpose of the communication and the audience being targeted. Face-to-face communication is personal and immediate, allowing a two-way flow of ideas.

Media like bulletin boards and financial reports, on the other hand, are drawn up for a particular audience and target mass reach. In the 21st century, however, it is possible to conduct business and communicate with employees using technologies like email, SMS, video conferencing, and handheld devices such as PDAs and the BlackBerry (Lengel & Daft, 1988). These technologies can also help a company increase two-way communication with management, making way for an efficient flow of ideas. Strategic use of these media can help management connect with lower-level staff and resolve conflicts that would otherwise go unaddressed and increase employee dissatisfaction (‘Whispering Class Must Be Heard’, 2008).

A firm that prepares tax returns for clients needs to communicate with clients and staff efficiently and immediately to resolve any issues that come up while drawing up papers and preparing the returns. In this regard, it is beneficial for the firm to make use of modern communication technology to communicate with its clients and staff. The firm can use SMS to inform staff of urgent meetings.

SMS can also be used to inform clients of sudden changes in plans or to schedule meetings when direct communication is not possible at the moment. Email, meanwhile, can be used to give clients updates on their tax returns and to flag any discrepancies or issues that come up. Staff can be delegated work and kept in the loop through detailed emails with attachments, such as tax return evidence.

Video conferencing can establish a communication link between the client and the staff working on the returns for face-to-face meetings when meeting in person is not possible due to geographic or time constraints.

While modern communication media can be expensive to acquire and use, their implementation can help the firm attain a competitive advantage through greater operational efficiency and more personalized service for its customers. These media can also be used strategically to motivate and reward employees: instead of a cash bonus or raise, a BlackBerry or an iPod can be provided (‘Rewarding a Job Well Done’, 2008). This increases employee motivation with rewards that are substantial and can be used for business purposes as well.

'Rewarding a Job Well Done', LW, 2008.

'Whispering Class Must Be Heard', 2008.

Lengel, R.H., & Daft, R.L., 'The Selection of Communication Media as an Executive Skill', Academy of Management Executive, 1988, 2, no. 3, pp. 225–32.


Source: IvyPanda (2021, November 10). 21st Century Communication Technology. https://ivypanda.com/essays/21st-century-communication-technology/

"21st Century Communication Technology." IvyPanda , 10 Nov. 2021, ivypanda.com/essays/21st-century-communication-technology/.

IvyPanda . (2021) '21st Century Communication Technology'. 10 November.

IvyPanda . 2021. "21st Century Communication Technology." November 10, 2021. https://ivypanda.com/essays/21st-century-communication-technology/.

1. IvyPanda . "21st Century Communication Technology." November 10, 2021. https://ivypanda.com/essays/21st-century-communication-technology/.

Bibliography

IvyPanda . "21st Century Communication Technology." November 10, 2021. https://ivypanda.com/essays/21st-century-communication-technology/.


Smartphone innovation in the third decade of the 21st century

Provided by Tecno Mobile

2019 was a year of triumphs and challenges for the smartphone industry. It was a time when manufacturers encountered an almost continual decline in global shipments, but it also marked the introduction of sophisticated new features such as foldable screens and long-awaited 5G technology. Though far from mature, these early designs are helping establish a solid foundation and direction for smartphone technology in 2020.

From the day they debuted, mobile phones have been evolving. The world has now entered an era of cell phones with superior functions. Over the past 20 years, the following aspects of mobile devices have undergone significant changes:

Style and appearance. One of the most observable changes is the look and feel of mobile phones. They have morphed from their original candy-bar form to the iconic flip phone, which enjoyed great popularity for a long time. Then the slide phone took its place, thanks to its modern design and convenient operation. Next was the touchscreen phone, the most widely available style in the current smartphone market.   


Size. Like computers, when mobile phones first came out, they were big and bulky; today they’re small and lightweight. The goal throughout their evolution was to meet the needs and expectations of users—to be more portable and user-friendly.

Function. Of course, mobile phones, starting way back with telephones, were invented for communication. In the past, phones played a single role: they allowed people to make calls. Thanks to the development of networks, technology, and social needs, phones today let users do much more—send and receive text messages and emails, take photos and videos, access the internet, listen to music, and play games, among many other functions. That’s not to mention artificial intelligence (AI) technology, steadily making its way into mobile phones and allowing for things like human-machine interaction—“Alexa, add milk and eggs to my shopping list.”   

Image/camera technology. The photography function is one of the most remarkable changes in modern smartphones. In the beginning, most smartphones had a single, rear-facing camera for taking photos. Then the smartphone camera came of age: it moved to the screen side, facing the user, and then added video, high-definition, night-mode, and anti-shake technology. Together, these features make photography the most valuable addition to the modern smartphone.


Stepping into 2020, here are six trends that users can expect to see in the latest generation of smartphones:

More screen. Most smartphone companies have introduced phones with full screens. There’s no denying that full screens have advantages—they give the smartphone an outstanding screen display and stunning visual effects. The full-screen design, to some extent, drives creative development in the smartphone industry.

One important feature of full-screen displays over the past few years is the notch—the black cutout, typically rectangular, at the top of the phone that houses sensors, speaker, and the phone’s front-facing camera. Two new designs attempting to maximize screen space—the water-drop notch and no-notch displays—will most likely be the dominant smartphone design in 2020. Compared with the more prominent monobrow notch, the water-drop notch display takes up less room and can have a screen-to-body ratio of 85%. The no-notch display is new to the market and has the highest screen-to-body ratio, in some phones thanks to a pop-up camera at the top of the phone.
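For concreteness, the screen-to-body ratio is simply the active display area divided by the area of the phone's front panel. The short sketch below works through the arithmetic with invented dimensions; the numbers are not measurements of any particular phone.

    # Screen-to-body ratio = active display area / front-panel area.
    # All dimensions are invented for illustration (millimetres).
    body_w, body_h = 71.0, 151.0        # front panel of the phone
    screen_w, screen_h = 65.0, 142.0    # active display

    ratio = (screen_w * screen_h) / (body_w * body_h)
    print(f"screen-to-body ratio: {ratio:.0%}")   # about 86% here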

Mobile photography. Super-high-definition cameras are a goal that smartphone companies will continue to chase after, as social media becomes an increasingly important part of modern life. People are spending more time on social media than ever before. Not only do they send and receive written messages through platforms such as Facebook and Twitter, they are also using smartphone cameras to capture and share important life moments, hence the demand for ever-higher photo quality.

In the second half of 2019, smartphones with 64-megapixel cameras were released to the market. This high-res imaging technology will be a big selling point in 2020, with 80% of smartphone companies putting out smartphones fitted with these cameras. The demand for better mobile photography dovetails with the introduction of 5G technology, which allows more data to be transmitted over wireless networks.

Foldable phones. 2019's CES show in Las Vegas highlighted that foldable technology is on the rise, particularly foldable personal computers. Limited by cost and technology, foldable smartphones have not yet been widely accepted by consumers. However, with the potential for an even greater screen size, a shift back towards foldable phones seems likely. Smartphone manufacturers are continuing to innovate around foldable models, and the industry expects to see new breakthroughs in the decade ahead.

Artificial intelligence. AI technology will significantly enhance the user experience, improving smartphones’ sensing, analyzing, and interacting functionality. For example, with on-device sensors such as Wi-Fi, Bluetooth, and Global Positioning System technology, a smartphone can create a rich, offline profile of its user and even foresee his or her needs, then make suggestions that will help fulfill them. And through machine-human interaction, the smartphone can collect feedback, continually correcting mistakes so that it’s in lockstep with the user.

As AI gets infused into smartphones in 2020 and beyond, it will also help improve photos, battery life, and cybersecurity.

5G. 2020 is considered the “first year” of 5G communication, which many analysts and observers predict will invigorate and restructure the telecoms industry. 5G technology promises to vastly boost the speed and widen the coverage of wireless networks, and its advent is a huge opportunity for smartphone companies, with hundreds of millions of 5G phones expected to ship in 2020.

Processors. There is still room for improving mobile processors in 2020, to keep pace with smartphone industry trends. More powerful processors mean improved graphics performance. The success of the Nintendo Switch video game console shows that smartphones with high-quality visuals are popular among users. Moreover, modern mobile chips that can support a smartphone’s camera, video, audio, gesture recognition, and other functionality have become mainstream, overtaking simple processor design. Smartphones this year will also see built-in 5G connectivity.


Information technologies of 21st century and their impact on the society

  • Original Research
  • Published: 16 August 2019
  • Volume 11, pages 759–766 (2019)

Mohammad Yamin (ORCID: orcid.org/0000-0002-3778-3366)

The twenty-first century has witnessed the emergence of some ground-breaking information technologies that have revolutionised our way of life. The revolution began late in the 20th century with the arrival of the internet in 1995, which has given rise to methods, tools and gadgets with astonishing applications in all academic disciplines and business sectors. In this article we shall provide a design for a 'spider robot' which may be used for efficient cleaning of deadly viruses. In addition, we shall examine some of the emerging technologies which are producing remarkable breakthroughs and improvements that were inconceivable earlier. In particular we shall look at the technologies and tools associated with the Internet of Things (IoT), Blockchain, Artificial Intelligence, Sensor Networks and Social Media, and analyse their capabilities and business value. As we recognise, most technologies, after completing their commercial journey, are utilised by the business world in physical as well as virtual marketing environments. We shall also look at the social impact of some of these technologies and tools.


1 Introduction

The internet, which started in 1989 [1], now holds 1.2 million terabytes of data from Google, Amazon, Microsoft and Facebook alone [2]. It is estimated that the internet contains over four and a half billion websites on the surface web; the deep web, about which we know very little, is at least four hundred times bigger than the surface web [3]. Soon afterwards, in 1990, email platforms emerged, followed by many other applications. From 1995 to the early 21st century we then saw a chain of Web 2.0 technologies such as e-commerce, social media platforms, e-business, e-learning, e-government and cloud computing [4]. Now we have a large number of internet-based technologies with countless applications in many domains including business, science and engineering, and healthcare [5]. The impact of these technologies on our personal lives is such that we are compelled to adopt many of them whether we like it or not.

In this article we shall study the nature, usage and capabilities of emerging and future technologies. These include Big Data analytics, the Internet of Things (IoT), sensor networks (RFID and location-based services), Artificial Intelligence (AI), robotics, Blockchain, mobile digital platforms (digital streets, towns and villages), Cloud (Fog and Dew) computing, social networks and business, and virtual reality.

2 Big data

With ever increasing computing power and declining costs of data storage, many government and private organizations are gathering enormous amounts of data. In many organizations, the data accumulated over years of acquisition and processing has become so large that it can no longer be analyzed by traditional tools within a reasonable time. Familiar disciplines that create Big data include astronomy, atmospheric science, biology, genomics, nuclear physics, biochemical experiments, medical records, and scientific research. Some of the organizations responsible for creating enormous data are Google, Facebook, YouTube, hospitals, proceedings of parliaments, courts, newspapers and magazines, and government departments. Because of its size, analysis of big data is not a straightforward task and often requires advanced methods and techniques. Lack of timely analysis of big data in certain domains may have devastating results and pose threats to societies, nature and the ecosystem.

2.1 Big medic data

The healthcare field is generating big data, and it has the potential to surpass other fields when it comes to the growth of data. Big medic data usually refers to a considerably bigger pool of health, hospital and treatment records, medical claims of an administrative nature, and data from clinical trials, smartphone applications, wearable devices such as RFID and heart-beat reading devices, different kinds of social media, and omics research. In particular, omics research (genomics, proteomics, metabolomics, etc.) is leading the charge in the growth of Big data [6, 7]. The data analytics requirements in omics research include several tasks such as data cleaning, normalization, biomolecule identification, data dimensionality reduction, biological contextualization, statistical validation, data storage and handling, sharing, and data archiving. These tasks are required for the Big data in omics datasets like genomics, transcriptomics, proteomics, metabolomics, metagenomics and phenomics [6].

According to [8], in 2011 alone, the data in the United States of America healthcare system amounted to one hundred and fifty exabytes (one exabyte = one billion gigabytes, or 10^18 bytes), and it is expected to soon reach 10^21 bytes and later 10^24 bytes. Some scientists have classified medical data into three categories: (a) a large number of samples but a small number of parameters; (b) a small number of samples and a small number of parameters; (c) a large number of samples and a large number of parameters [9]. Although the data in the first category may be analyzed by classical methods, it may be incomplete, noisy and inconsistent, requiring data cleaning. The data in the third category could be big and may require advanced analytics.

2.2 Big data analytics

Big data cannot be analyzed in real time by traditional analytical methods. The analysis of Big data, popularly known as Big Data Analytics, often involves a number of technologies, sophisticated processes and tools, as depicted in Fig. 1. Big data can provide smart decision making and business intelligence to businesses and corporations; unless analyzed, however, big data is impractical and a burden to the organization. Big data analytics involves mining and extracting useful associations (knowledge discovery) for intelligent decision-making and forecasts. The challenges in Big Data analytics are computational complexity, scalability and visualization of data. Moreover, the information security risk increases with the surge in the amount of data, which is the case in Big Data.

[Figure 1: Big Data Analytics]

The aim of data analytics has always been knowledge discovery to support smart and timely decision making. With big data, the knowledge base becomes wider and sharper, providing greater business intelligence and helping businesses become market leaders. Conventional processing paradigms and architectures are inefficient for dealing with datasets of this size, which require parallel processing. Recent technologies like Spark, Hadoop, MapReduce, R, data lakes and NoSQL have emerged to provide Big Data analytics, and alongside these technologies it is advantageous to invest in designing superior storage systems.
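As a minimal illustration of this style of parallel analysis, the sketch below uses PySpark, one of the tools named above, to aggregate a large table of hospital admission records. The file name and column names are assumptions made for the example, not anything prescribed by the article.

    from pyspark.sql import SparkSession, functions as F

    # Minimal sketch: aggregate a large CSV in parallel with Spark.
    # "admissions.csv" and its columns are hypothetical.
    spark = SparkSession.builder.appName("big-data-sketch").getOrCreate()

    records = spark.read.csv("admissions.csv", header=True, inferSchema=True)

    summary = (records.groupBy("diagnosis")
                      .agg(F.count("*").alias("cases"),
                           F.avg("length_of_stay").alias("avg_stay_days"))
                      .orderBy(F.desc("cases")))

    summary.show(10)   # ten most frequent diagnoses with their average stay
    spark.stop()

Spark splits the file across a cluster and runs the grouping in parallel, which is exactly the property that makes datasets of this size tractable.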

Health data predominantly consists of visual data, graphs, audio and video. Analysing such data to gain meaningful insights and diagnoses may depend on the choice of tools. Medical data has traditionally been scattered across organizations, often not organized properly. What we usually find are medical record-keeping systems consisting of heterogeneous data, which require more effort to reorganize onto a common platform. As discussed before, the health profession produces enormous amounts of data, so analysing it in an efficient and timely manner can potentially save many lives.

3 Cloud computing

Commercial operation of Clouds began in 1999 [10]. Initially, clouds complemented and empowered outsourcing. At the early stages there were privacy concerns associated with Cloud computing, as the owners of data had to give custody of their data to the Cloud owners. However, as time passed and Cloud owners took confidence-building measures, the technology became so prevalent that most of the world's SMEs started using it in one form or another. More information on Cloud computing can be found in [11, 12].

3.1 Fog computing

As faster processing became a need for some critical applications, the clouds gave rise to Fog, or Edge, computing. As can be seen in the Gartner hype cycles in Figs. 2 and 3, Edge computing, as an emerging technology, peaked in 2017–18. As shown in the Cloud computing architecture in Fig. 4, the middle or second layer of the cloud configuration is represented by Fog computing. For some applications, the communication delay between computing devices in the field and data in a Cloud (often physically thousands of miles apart) is detrimental to the time requirements, as it may cause considerable delay in time-sensitive applications. For example, processing and storage for early warning of disasters (stampedes, tsunamis, etc.) must happen in real time. For these kinds of applications, computing and storage resources should be placed closer to where the computing is needed (application areas like a digital street); in such scenarios Fog computing is considered suitable [13]. Clouds are an integral part of many IoT applications and play a central role in ubiquitous computing systems in health-related cases like the one depicted in Fig. 5. Some applications of Fog computing can be found in [14, 15, 16]; more results on Fog computing are also available in [17, 18, 19].

[Figure 2: Emerging Technologies 2018]

[Figure 3: Emerging Technologies 2017]

[Figure 4: Relationship of Cloud, Fog and Dew computing]

[Figure 5: Snapshot of a Ubiquitous System]

3.2 Dew computing

When Fog is overloaded and unable to cater for peaks of high demand, it offloads some of its data and/or processing to the associated cloud. In such a situation, Fog exposes its dependency on a complementary bottom layer of the cloud architecture, shown in Fig. 4. This bottom layer of the hierarchical resource organization is known as the Dew layer. The purpose of the Dew layer is to cater for tasks by exploiting resources near the end-user, with minimal internet access [17, 20]. Dew computing also determines when to draw on services from the different layers of the Cloud architecture. It is important to note that Dew computing [20] belongs to the distributed computing hierarchy and is integrated with Fog computing services, as is evident in Fig. 4. In summary, the Cloud architecture has three layers: Cloud first, Fog second and Dew third.
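To make the three-layer hierarchy concrete, here is one plausible placement policy in Python: run each task on the nearest tier that meets its deadline and has spare capacity, and fall back outward to the cloud otherwise. The latency and load figures are invented for illustration; the article does not prescribe any particular policy.

    from dataclasses import dataclass

    @dataclass
    class Tier:
        name: str
        rtt_ms: float   # typical round-trip time to the tier
        load: float     # current utilisation, 0.0 to 1.0

    # Illustrative figures only; a real system would measure these.
    DEW, FOG, CLOUD = Tier("dew", 1, 0.2), Tier("fog", 10, 0.4), Tier("cloud", 120, 0.3)

    def place_task(deadline_ms, max_load=0.8):
        """Pick the nearest tier that meets the deadline and is not
        overloaded; otherwise fall back to the cloud."""
        for tier in (DEW, FOG, CLOUD):
            if tier.rtt_ms <= deadline_ms and tier.load < max_load:
                return tier
        return CLOUD   # last resort, even if the deadline is missed

    print(place_task(deadline_ms=5).name)    # 'dew': time-critical work stays local
    DEW.load, FOG.load = 0.95, 0.9           # peak demand saturates dew and fog
    print(place_task(deadline_ms=50).name)   # 'cloud': work is offloaded outward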

4 Internet of things

The definition of the Internet of Things (IoT), as depicted in Fig. 6, has been changing with the passage of time. With a growing number of internet-based applications using many technologies, devices and tools, the meaning of IoT has evolved accordingly: things (technologies, devices and tools) are used together in internet-based applications to generate data and provide assistance and services to users from anywhere, at any time. The internet can be considered a uniform technology from any location, as it provides the same service of 'connectivity'; its speed and security, however, are not uniform. The IoT as an emerging technology peaked during 2017–18, as is evident from Figs. 2 and 3, and it is expanding at a very fast rate. According to [21, 22, 23, 24], the number of IoT devices could be in the millions by the year 2021.

[Figure 6: Internet of Things]

IoT is providing some amazing applications in tandem with wearable devices, sensor networks, Fog computing and other technologies, improving critical facets of our lives like healthcare management, service delivery and business operations. Some applications of IoT in the field of crowd management are discussed in [14], and some applications of IoT in the context of privacy and security are discussed in [15, 16]. The key devices and technologies associated with IoT include RFID tags [25], the internet, computers, cameras, mobile devices, coloured lights, sensors, sensor networks, drones, and Cloud, Fog and Dew computing.
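One common interaction pattern is for sensor nodes to publish readings over a lightweight messaging protocol such as MQTT, which a fog node or cloud service then consumes. The sketch below is an assumed example using the paho-mqtt client library; the broker address, topic and readings are placeholders.

    import json, random, time
    import paho.mqtt.client as mqtt   # pip install "paho-mqtt<2"

    # Hypothetical broker and topic; the temperature readings are simulated.
    BROKER, TOPIC = "broker.example.com", "ward1/temperature"

    client = mqtt.Client()
    client.connect(BROKER, 1883)
    client.loop_start()               # handle network traffic in the background

    for _ in range(5):
        reading = {"sensor": "t-001", "celsius": round(random.uniform(36.0, 38.5), 1)}
        client.publish(TOPIC, json.dumps(reading))   # any subscriber receives this
        time.sleep(2)

    client.loop_stop()
    client.disconnect()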

5 Applications of blockchain

Blockchain is usually associated with cryptocurrencies like Bitcoin (currently there are over one and a half thousand cryptocurrencies, and the number is still rising), but the technology can also be used for many more critical applications of our daily lives. Blockchain is a distributed ledger technology in the form of a distributed transactional database, secured by cryptography and governed by a consensus mechanism. A Blockchain is essentially a record of digital events [26]. A block represents a completed transaction or ledger entry, and subsequent and prior blocks are chained together, displaying the status of the most recent transaction. The role of the chain is to link records in chronological order; the chain continues to grow as further transactions take place and new blocks are added. User security and ledger consistency are provided by asymmetric cryptography and distributed consensus algorithms. Once a block is created, it cannot be altered or removed. The technology eliminates the need for a bank statement to verify the availability of funds or a lawyer to certify the occurrence of an event. The benefits of Blockchain technology are inherent in its characteristics of decentralization, persistency, anonymity and auditability [27, 28].
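The hash-chaining idea described above fits in a few lines of Python. This is a deliberately minimal sketch: it shows only why altering an earlier block invalidates every later one, and leaves out the consensus and signature machinery the article mentions.

    import hashlib, json, time

    def make_block(transaction, prev_hash):
        """Create a block that commits to its predecessor's hash."""
        block = {"ts": time.time(), "transaction": transaction, "prev_hash": prev_hash}
        payload = json.dumps(block, sort_keys=True).encode()
        block["hash"] = hashlib.sha256(payload).hexdigest()
        return block

    def verify(chain):
        """Recompute each hash and check every block points at its predecessor."""
        for i, block in enumerate(chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != block["hash"]:
                return False
            if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
                return False
        return True

    chain = [make_block({"from": "alice", "to": "bob", "amount": 5}, "0" * 64)]
    chain.append(make_block({"from": "bob", "to": "carol", "amount": 2}, chain[-1]["hash"]))

    print(verify(chain))                      # True
    chain[0]["transaction"]["amount"] = 500   # tamper with history...
    print(verify(chain))                      # False: the chain detects it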

5.1 Blockchain for business use

Blockchain, being the technology behind cryptocurrencies, started as an open-source Bitcoin community effort to allow reliable peer-to-peer financial transactions. Blockchain technology has made it possible to build a globally functional currency relying on code, without using any bank or third-party platform [28]. These features have made Blockchain technology secure and transparent for business transactions of any kind involving any currency. In the literature we find many applications of Blockchain; nowadays they involve various kinds of transactions requiring verification and automated payment through smart contracts. The concept of Smart Contracts [28] has virtually eliminated the role of intermediaries. The technology is most suitable for businesses requiring high reliability and honesty, and because of its security and transparency it would benefit businesses trying to attract customers. Blockchain can also be used to eliminate fake permits, as can be seen in [29].

5.2 Blockchain for healthcare management

As discussed above, Blockchain is an efficient and transparent way of digital record keeping, a feature highly desirable in healthcare management. The medical field is still struggling to manage its data efficiently in digital form. As usual, the issues of disparate and non-uniform record storage methods are hampering digitization, data warehousing and big data analytics, which would allow efficient management and sharing of the data. We can gauge the magnitude of these problems from examples such as the National Health Service (NHS) of the United Kingdom's target to digitize UK healthcare by 2023 [30]. These problems lead to inaccuracies in data, which can cause many issues in healthcare management, including clinical and administrative errors.

Use of Blockchain in healthcare can bring revolutionary improvements. For example, smart contracts can be used to make it easier for doctors to access patients' data from other organisations. The current consent process often involves bureaucratic steps and is far from simplified or standardised, which creates many problems for patients and the specialists treating them. The cost associated with transferring medical records between different locations can be significant, and it can be reduced to virtually zero by using Blockchain. More information on the use of Blockchain for healthcare data can be found in [30, 31].
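For a rough sense of what such a consent smart contract might encode, here is the access rule alone, sketched in Python. A real deployment would express this in a contract language on a permissioned ledger, and every name and rule below is purely illustrative.

    # Sketch of the consent logic a healthcare smart contract might encode.
    consents = {}   # (patient, organisation) -> consent expiry timestamp

    def grant_access(patient, org, until_ts):
        consents[(patient, org)] = until_ts   # would be a ledger transaction

    def may_read(patient, org, now_ts):
        return consents.get((patient, org), 0) > now_ts

    grant_access("patient-42", "clinic-B", until_ts=1_700_000_000)
    print(may_read("patient-42", "clinic-B", now_ts=1_600_000_000))    # True
    print(may_read("patient-42", "hospital-C", now_ts=1_600_000_000))  # False: no consent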

6 Environment cleaning robot

One ongoing healthcare issue is the eradication of deadly viruses and bacteria from hospitals and healthcare units. Nosocomial infections are a common problem for hospitals and are currently treated using various techniques [32, 33]. Historically, cleaning hospital wards and operating rooms with chlorine has been an effective approach, but in the face of deadly viruses like Ebola, HIV/AIDS, swine influenza (H1N1, H1N2), various strands of flu, Severe Acute Respiratory Syndrome (SARS) and Middle East Respiratory Syndrome (MERS), this method has dangerous implications [14]. A more advanced approach used in US hospitals employs "robots" to purify the space, as can be seen in [32, 33]. However, the current "robots" have certain limitations. Most of these devices require a human to place them in the infected areas. They cannot move effectively (they just revolve around themselves), so the UV light does not reach all areas but only a very limited area within the range of the UV light emitter. Finally, the robot itself may become infected, as the light does not reach most of the robot's own surfaces. There is therefore an emerging need to build a robot that does not require the physical presence of humans to handle it, and that can purify an entire room by covering all its surfaces with UV light while not becoming infected itself.

Figure 7 gives an overview of the design of a fully motorized spider robot with six legs. The robot supports Wi-Fi connectivity for control and can move around the room to clean the entire area. The spider design allows the robot to move on any surface, including climbing steps; most importantly, the robot will use its legs to move the UV light emitter as well as to clean its own body before leaving the room. This substantially reduces the risk of the robot transmitting any infections.

[Figure 7: Spider robot for virus cleaning]

Additionally, the robot will be equipped with a motorized camera allowing the operator to monitor the space and stop the process of emitting UV light in unpredicted situations. The operator can control the robot via a networked graphical user interface and/or from an augmented reality environment utilizing technologies such as the Oculus Touch. In more detail, the user will wear the Oculus Rift virtual reality helmet and use the Oculus Touch hand controllers to remotely control the robot. This will provide the user with the robot's vision in a natural manner. It will also allow the user to control the robot's two front robotic arms via the Oculus Touch controller, making it easy to conduct advanced movements simply by moving the hands. The physical movements of the human hand will be captured by the sensors of the Oculus Touch and transmitted to the robot, which will then use inverse kinematics to translate the actions and position of the human hand into movements of the robotic arm. This technique will also be used during the training phase of the robot, where the human user will teach the robot how to clean various surfaces and then purify itself, simply by moving their hands accordingly. The design of the spider robot was proposed in a project proposal submitted to the King Abdulaziz City of Science and Technology ( https://www.kacst.edu.sa/eng/Pages/default.aspx ) by the author and George Tsaramirsis ( https://www.researchgate.net/profile/George_Tsaramirsis ).
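The inverse-kinematics step can be illustrated with the textbook two-link planar case: given a target hand position, solve for the two joint angles that reach it. The Python sketch below uses the standard law-of-cosines solution; the link lengths and target are assumed values, and a real six-legged robot would of course use a more elaborate kinematic chain.

    import math

    L1, L2 = 0.30, 0.25   # arm segment lengths in metres (illustrative)

    def two_link_ik(x, y):
        """Return (shoulder, elbow) angles in radians for target (x, y),
        or None if the point is out of reach."""
        cos_elbow = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
        if abs(cos_elbow) > 1.0:
            return None                  # target outside the workspace
        elbow = math.acos(cos_elbow)     # "elbow-down" solution
        shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                                 L1 + L2 * math.cos(elbow))
        return shoulder, elbow

    angles = two_link_ik(0.35, 0.20)     # a reachable point in front of the arm
    if angles:
        print([round(math.degrees(a), 1) for a in angles])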

7 Conclusions

We have presented details of some emerging technologies and real-life applications that are providing businesses with remarkable opportunities, previously unthinkable. Businesses are continuously trying to increase their use of new technologies and tools to improve processes and benefit their clients. The IoT and associated technologies are now able to provide real-time and ubiquitous processing, eliminating the need for human surveillance. Similarly, Virtual Reality, Artificial Intelligence and robotics are finding remarkable applications in the field of medical surgery. As discussed, with the help of technology we can now predict and mitigate some natural disasters, such as stampedes, with the help of sensor networks and other associated technologies. Finally, the growth of Big Data analytics is equipping businesses and government agencies with smarter decision making to achieve their targets.

Naughton John (2016) The evolution of the internet: from military experiment to general purpose technology. J Cyber Policy 1(1):5–28. https://doi.org/10.1080/23738871.2016.1157619


Gareth Mitchell (2019). How much data is on the internet? Science Focus (The Home of BBC Science Focus Magazine). [Online]. https://www.sciencefocus.com/future-technology/how-much-data-is-on-the-internet/ . Accessed 20 April 2019

Mae Rice (2018). The deep web is the 99% of the internet you can’t google. Curiosity. [Online]. https://curiosity.com/topics/the-deep-web-is-the-99-of-the-internet-you-cant-google-curiosity/ . Accessed 20 April 2019

Slumkoski C (2012) History on the internet 2.0: the rise of social media. Acadiensis 41(2):153–162 (Summer/Autumn-ÉTÉ/Automne)


Ibarra-Esquer JE, González-Navarro FF, Flores-Rios BL, Burtseva L, María A, Astorga-Vargas M (2017) Tracking the evolution of the internet of things concept across different application domains. Sensors (Basel) 17(6):1379. https://doi.org/10.3390/s17061379

Misra Biswapriya B, Langefeld Carl, Olivier Michael, Cox Laura A (2018) Integrated omics: tools, advances and future approaches. J Mol Endocrinol 62(1):R21–R45. https://doi.org/10.1530/JME-18-0055

Lee Choong Ho, Yoon Hyung-Jin (2017) Medical big data: promise and challenges. Kidney Res Clin Practice 36(1):3–13. https://doi.org/10.23876/j.krcp.2017.36.1.3

Faggella D (2019). Where healthcare's big data actually comes from. Emerj. [Online]. Last accessed from https://emerj.com/ai-sector-overviews/where-healthcares-big-data-actually-comes-from/

Sinha A, Hripcsak G, Markatou M (2009) Large datasets in biomedicine: a discussion of salient analytic issues. J Am Med Inform Assoc 16:759–767. https://doi.org/10.1197/jamia.M2780

Foote KD (2017) A brief history of cloud computing. DATAVERSITY. [Online]. Last accessed on 5/5/2019

Vassakis K, Petrakis E, Kopanakis I (2018) Big data analytics: applications, prospects and challenges. In: Skourletopoulos G, Mastorakis G, Mavromoustakis C, Dobre C, Pallis E (eds) Mobile big data. Lecture notes on data engineering and communications technologies, vol 10. Springer, Cham

Yamin M, Al Makrami AA (2015) Cloud computing in SMEs: case of Saudi Arabia. BIJIT—BVICAM’s Int J Inform Technol 7(1):853–860

Ahmed E, Ahmed A, Yaqoob I, Shuja J, Gani A, Imran M, Shoaib M (2017) Bringing computation closer toward the user network: is edge computing the solution? IEEE Commun Mag 55:138–144

Yamin M, Basahel AM, Abi Sen AA (2018) Managing crowds with wireless and mobile technologies. Hindawi. Wireless Commun Mobile Comput. Volume 2018, Article ID 7361597, pp 15. https://doi.org/10.1155/2018/7361597

Yamin M, Abi Sen AA (2018) Improving privacy and security of user data in location based services. Int J Ambient Comput Intell 9(1):19–42. https://doi.org/10.4018/IJACI.2018010102

Sen AAA, Eassa FA, Jambi K, Yamin M (2018) Preserving privacy in internet of things—a survey. Int J Inform Technol 10(2):189–200. https://doi.org/10.1007/s41870-018-0113-4

Longo Mathias, Hirsch Matías, Mateos Cristian, Zunino Alejandro (2019) Towards integrating mobile devices into dew computing: a model for hour-wise prediction of energy availability. Information 10(3):86. https://doi.org/10.3390/info10030086

Nunna S, Kousaridas A, Ibrahim M, Dillinger M, Thuemmler C, Feussner H, Schneider A Enabling real-time context-aware collaboration through 5G and mobile edge computing. In: Proceedings of the 12th international conference on information technology-new generations, Las Vegas, NV, USA, 13–15 April 2015; pp 601–605

Vaquero LM, Rodero-Merino L (2014) Finding your way in the fog: towards a comprehensive definition of fog computing. SIGCOMM Comput Commun Rev 44:27–32

Ray PP (2019) Minimizing dependency on internetwork: Is dew computing a solution? Trans Emerg Telecommun Technol 30:e3496

Bonomi F, Milito R, Zhu J, Addepalli S Fog computing and its role in the internet of things. In: Proceedings of the first edition of the workshop on mobile cloud computing, Helsinki, Finland, 17 August 2012; pp 13–16

Jia X, Feng Q, Fan T, Lei Q, RFID technology and its applications in internet of things (IoT), consumer electronics, communications and networks (CECNet). In: 2nd international conference proceedings, pp 1282–1285. IEEE, 2012, https://doi.org/10.1109/cecnet.2012.6201508

Said O, Masud M (2013) Towards internet of things: survey and future vision. Int J Comput Netw 5(1):1–17

Gubbi J, Buyya R, Marusic S, Palaniswami M (2013) Internet of things (IoT): a vision, architectural elements, and future directions. Future Gener Comput Syst 29(7):1645–1660. https://doi.org/10.1016/j.future.2013.01.010

Beck R, Avital M, Rossi M et al (2017) Blockchain technology in business and information systems research. Bus Inf Syst Eng 59:381. https://doi.org/10.1007/s12599-017-0505-1

Yamin M (2018) Managing crowds with technology: cases of Hajj and Kumbh Mela. Int J Inform Technol. https://doi.org/10.1007/s41870-018-0266-1

Zheng Z, Xie S, Dai H, Chen X, Wang H An overview of blockchain technology: architecture, consensus, and future trends. https://www.researchgate.net/publication/318131748_An_Overview_of_Blockchain_Technology_Architecture_Consensus_and_Future_Trends . Accessed May 01 2019

Al-Saqafa W, Seidler N (2017) Blockchain technology for social impact: opportunities and challenges ahead. J Cyber Secur Policy. https://doi.org/10.1080/23738871.2017.1400084

Alotaibi M, Alsaigh M, Yamin M (2019) Blockchain for controlling Hajj and Umrah permits. Int J Comput Sci Netw Secur 19(4):69–77

Vazirani AA, O’Donoghue O, Brindley D (2019) Implementing blockchains for efficient health care: systematic review. J Med Internet Res 21(2):12439. https://doi.org/10.2196/12439

Yamin M (2018) IT applications in healthcare management: a survey. Int J Inform Technol 10(4):503–509. https://doi.org/10.1007/s41870-018-0203-3

Begić A (2018) Application of service robots for disinfection in medical institutions. In: Hadžikadić M, Avdaković S (eds) Advanced Technologies, Systems, and Applications II (IAT 2017). Lecture Notes in Networks and Systems, vol 28. Springer, Cham

Mettler T, Sprenger M, Winter R (2017) Service robots in hospitals: new perspectives on niche evolution and technology affordances. Euro J Inform Syst. 10:11. https://doi.org/10.1057/s41303-017-0046-1

Author information: Mohammad Yamin, Department of MIS, Faculty of Economics and Admin, King Abdulaziz University, Jeddah, Saudi Arabia.

Yamin, M. Information technologies of 21st century and their impact on the society. Int. j. inf. tecnol. 11 , 759–766 (2019). https://doi.org/10.1007/s41870-019-00355-1


Keywords: Emerging and future technologies; Internet of things; Sensor networks; Location based services; Mobile digital platforms
