Internet Timeline


1969
The ARPANET, created by ARPA (the Advanced Research Projects Agency), goes online in December, connecting four major U.S. research institutions. Designed for research, education, and government organizations, it provides a communications network that could link the country even if a military attack destroyed conventional communications systems.

1972
Electronic mail is introduced by Ray Tomlinson, a Cambridge, Mass., computer scientist. He uses the @ symbol to separate the sender's name from the network name in the email address.

1973
Transmission Control Protocol/Internet Protocol (TCP/IP) is designed; in 1983 it becomes the standard for communicating between computers over the Internet. A related protocol, FTP (File Transfer Protocol), allows users to log onto a remote computer, list the files on that computer, and download files from it.

1976
Presidential candidate Jimmy Carter and running mate Walter Mondale use email to plan campaign events.
Queen Elizabeth II sends her first email, becoming the first head of state to do so.

1982
The word “Internet” is used for the first time.

1984
Domain Name System (DNS) is established, with network addresses identified by extensions such as .com, .org, and .edu.
Writer William Gibson coins the term “cyberspace.”

1985
Quantum Computer Services, which later changes its name to America Online, debuts. It offers email, electronic bulletin boards, news, and other information.

1988
A virus called the Internet Worm (later known as the Morris worm) temporarily shuts down about 10% of the world's Internet servers.

1989
The World (world.std.com) debuts as the first provider of dial-up Internet access for consumers.
Tim Berners-Lee of CERN (European Laboratory for Particle Physics) develops a new technique for distributing information on the Internet. He calls it the World Wide Web. The Web is based on hypertext, which permits the user to connect from one document to another at different sites on the Internet via hyperlinks (specially programmed words, phrases, buttons, or graphics). Unlike other Internet protocols, such as FTP and email, the Web is accessible through a graphical user interface.

1990
The first effort to index the Internet is created by Peter Deutsch at McGill University in Montreal, who devises Archie, an archive of FTP sites.

1991
Gopher, which provides point-and-click navigation, is created at the University of Minnesota and named after the school mascot. Gopher becomes the most popular interface for several years.
Another indexing system, WAIS (Wide Area Information Server), is developed by Brewster Kahle of Thinking Machines Corp.

1993
Mosaic is developed by Marc Andreessen at the National Center for Supercomputing Applications (NCSA). It becomes the dominant navigating system for the World Wide Web, which at this time accounts for merely 1% of all Internet traffic.

1994
The White House launches its website, www.whitehouse.gov.
Initial commerce sites are established and mass marketing campaigns are launched via email, introducing the term “spamming” to the Internet vocabulary.
Marc Andreessen and Jim Clark start Netscape Communications. They introduce the Navigator browser.

1995
CompuServe, America Online, and Prodigy start providing dial-up Internet access.
Sun Microsystems releases the Internet programming language called Java.
The Vatican launches its own website, www.vatican.va.

1996
Approximately 45 million people are using the Internet, with roughly 30 million of those in North America (United States and Canada), 9 million in Europe, and 6 million in Asia/Pacific (Australia, Japan, etc.). 43.2 million U.S. households (44%) own a personal computer, and 14 million of them are online.

1997
On July 8, 1997, Internet traffic records are broken as the NASA website broadcasts images taken by Pathfinder on Mars. The broadcast generates 46 million hits in one day.
The term “weblog” is coined. It’s later shortened to “blog.”

1998
Google opens its first office, in California.

1999
College student Shawn Fanning invents Napster, a computer application that allows users to swap music over the Internet. The number of Internet users worldwide reaches 150 million by the beginning of 1999. More than 50% are from the United States. “E-commerce” becomes the new buzzword as Internet shopping rapidly spreads.
MySpace.com is launched (at this point an online file-storage service, unrelated to the later social network).

2000
To the chagrin of the Internet population, deviant computer programmers begin designing and circulating viruses with greater frequency. “Love Bug” and “Stages” are two examples of self-replicating viruses that send themselves to people listed in a computer user's email address book. The heavy volume of email messages being sent and received forces many infected companies to temporarily shut down their clogged networks. The Internet bubble bursts, as the fountain of investment capital dries up and the Nasdaq stock index plunges, causing the initial public offering (IPO) window to slam shut and many dotcoms to close their doors.
America Online buys Time Warner in a deal valued at about $165 billion. It’s the biggest merger of all time.

2001
Napster is dealt a potentially fatal blow when the 9th U.S. Circuit Court of Appeals in San Francisco rules that the company is violating copyright laws and orders it to stop distributing copyrighted music. The file-swapping company says it is developing a subscription-based service. About 9.8 billion electronic messages are sent daily.
Wikipedia is created.

2002
As of January, 58.5% of the U.S. population (164.14 million people) uses the Internet. Worldwide there are 544.2 million users. The death knell tolls for Napster after a bankruptcy judge rules in September that German media giant Bertelsmann cannot buy the assets of troubled Napster Inc. The ruling prompts Napster CEO Konrad Hilbers to resign and lay off his staff.

2003
It's estimated that Internet users illegally download about 2.6 billion music files each month. Spam, unsolicited email, becomes a server-clogging menace. It accounts for about half of all emails. In December, President Bush signs the Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003 (CAN-SPAM Act), which is intended to help individuals and businesses control the amount of unsolicited email they receive. Apple Computer introduces Apple iTunes Music Store, which allows people to download songs for 99 cents each.

2004
An Internet worm called MyDoom, or Novarg, spreads through Internet servers. About 1 in 12 email messages is infected.
Online spending reaches a record high—$117 billion in 2004, a 26% increase over 2003.

2005
YouTube.com is launched.

2006
There are more than 92 million websites online.

2007
Legal online music downloads triple to 6.7 million downloads per week.
The Colorado Rockies' computer system crashes when it receives 8.5 million hits within the first 90 minutes of World Series ticket sales.
The online game World of Warcraft hits a milestone when it surpasses 9 million subscribers worldwide in July.

2008
In a move to challenge Google's dominance of search and advertising on the Internet, software giant Microsoft offers to buy Yahoo for $44.6 billion.
In a San Francisco federal district court, Judge Jeffrey S. White orders the disabling of Wikileaks.org, a Web site that discloses confidential information. The case was brought by Julius Baer Bank and Trust, located in the Cayman Islands, after a disgruntled ex-employee allegedly provided Wikileaks with stolen documents that implicate the bank in asset hiding, money laundering, and tax evasion. Many web communities, who see the ruling as unconstitutional, publicize alternate addresses for the site and distribute bank documents through their own networks. In response, Judge White issues another order to stop the distribution of bank documents.
Microsoft is fined $1.3 billion by the European Commission for further abusing its dominant market position and failing to comply with the Commission's 2004 judgment, which ordered Microsoft to give competitors the information necessary to operate with Windows. Since 2004, Microsoft has been fined a total of $2.5 billion by the Commission for not adhering to that ruling.


How Big Is a Bit?

bit: The word "bit" is short for "binary digit." A bit is the smallest piece of computer information.
byte: Most computers use combinations of eight bits, called bytes, to represent one character of data. For example, the word "cat" has three characters, and it would be represented by three bytes.
kilobyte (K or KB): A kilobyte is equal to 1,024 bytes.
megabyte (MB): A megabyte is equal to 1,048,576 bytes, but it is usually rounded off to one million bytes.
gigabyte (GB): A gigabyte is one thousand megabytes.
terabyte (TB): A terabyte is one trillion bytes, or 1000 gigabytes.
Computer memory is usually measured in megabytes or gigabytes. This tells how much information your computer can store.
The speed of a modem (a device that connects two computers over a telephone line) is measured in bits per second, or bps. This tells how much information can be sent in a second.
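The definitions above can be sketched in a short Python snippet. The word, file size, and modem speed used here are just illustrative examples, not figures from the text:

```python
# Unit sizes as defined above: a byte is 8 bits, a kilobyte is
# 1,024 bytes, and a megabyte is 1,048,576 bytes (1024 * 1024).
BITS_PER_BYTE = 8
KILOBYTE = 1024          # bytes
MEGABYTE = 1024 ** 2     # 1,048,576 bytes

# "cat" has three characters, so it takes three bytes (24 bits).
word = "cat"
size_in_bytes = len(word)
size_in_bits = size_in_bytes * BITS_PER_BYTE
print(size_in_bytes, size_in_bits)   # 3 24

# A modem's speed in bits per second (bps) tells how long a
# transfer takes. As a hypothetical example, a 1 MB file over
# a 56,000 bps modem:
file_size_bits = 1 * MEGABYTE * BITS_PER_BYTE
modem_bps = 56_000
seconds = file_size_bits / modem_bps
print(round(seconds))                # 150 (about two and a half minutes)
```

The same arithmetic explains why a faster connection matters: doubling the bits per second halves the download time.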


Computer Virus Timeline

1949
Theories for self-replicating programs are first developed.

1981
Apple Viruses 1, 2, and 3 are some of the first viruses “in the wild,” or in the public domain. Found on the Apple II operating system, the viruses spread through Texas A&M via pirated computer games.

1983
Fred Cohen, while working on his dissertation, formally defines a computer virus as “a computer program that can affect other computer programs by modifying them in such a way as to include a (possibly evolved) copy of itself.”

1986
Two programmers named Basit and Amjad replace the executable code in the boot sector of a floppy disk with their own code, designed to infect each 360KB floppy accessed on any drive. Infected floppies carry “© Brain” as a volume label.

1987
The Lehigh virus, one of the first file viruses, infects command.com files.

1988
One of the most common viruses, Jerusalem, is unleashed. Activated every Friday the 13th, the virus affects both .exe and .com files and deletes any programs run on that day.
MacMag and the Scores virus cause the first major Macintosh outbreaks.

1990
Symantec launches Norton AntiVirus, one of the first antivirus programs developed by a large company.

1991
Tequila is the first widespread polymorphic virus found in the wild. Polymorphic viruses make detection difficult for virus scanners by changing their appearance with each new infection.

1992
1,300 viruses are in existence, an increase of 420% from December of 1990.
The Dark Avenger Mutation Engine (DAME) is created. It is a toolkit that turns ordinary viruses into polymorphic viruses. The Virus Creation Laboratory (VCL) is also made available. It is the first actual virus creation kit.

1994
Good Times email hoax tears through the computer community. The hoax warns of a malicious virus that will erase an entire hard drive just by opening an email with the subject line “Good Times.” Though disproved, the hoax resurfaces every six to twelve months.

1995
Word Concept becomes one of the most prevalent viruses in the mid-1990s. It is spread through Microsoft Word documents.

1996
The Baza, Laroux (a macro virus), and Staog viruses are the first to infect Windows 95 files, Excel spreadsheets, and Linux, respectively.

1998
Currently harmless and yet to be found in the wild, StrangeBrew is the first virus to infect Java files. The virus modifies CLASS files to contain a copy of itself within the middle of the file's code and to begin execution from the virus section.
The Chernobyl virus spreads quickly via .exe files. As the notoriety attached to its name would suggest, the virus is quite destructive, attacking not only files but also the BIOS chip within infected computers.
Two California teenagers infiltrate and take control of more than 500 military, government, and private sector computer systems.

1999
The Melissa virus, W97M/Melissa, executes a macro in a document attached to an email, which forwards the document to 50 people in the user's Outlook address book. The virus also infects other Word documents and subsequently mails them out as attachments. Melissa spreads faster than any previous virus, infecting an estimated 1 million PCs.
Bubble Boy is the first worm that does not depend on the recipient opening an attachment in order for infection to occur. As soon as the user opens the email, Bubble Boy sets to work.
Tristate is the first multi-program macro virus; it infects Word, Excel, and PowerPoint files.

2000
The Love Bug, also known as the ILOVEYOU virus, sends itself out via Outlook, much like Melissa. The virus comes as a VBS attachment and deletes files, including MP3, MP2, and .JPG. It also sends usernames and passwords to the virus's author.
W97M.Resume.A, a new variation of the Melissa virus, is determined to be in the wild. The “resume” virus acts much like Melissa, using a Word macro to infect Outlook and spread itself.
The “Stages” virus, disguised as a joke email about the stages of life, spreads across the Internet. Unlike most previous viruses, Stages is hidden in an attachment with a false “.txt” extension, making it easier to lure recipients into opening it. Until now, it has generally been safe to assume that text files are safe.
“Distributed denial-of-service” attacks by hackers knock Yahoo, eBay, Amazon, and other high profile web sites offline for several hours.

2001
Shortly after the September 11th attacks, the Nimda virus infects hundreds of thousands of computers around the world. The virus is one of the most sophisticated to date, with as many as five different methods of replicating and infecting systems. The “Anna Kournikova” virus, which mails itself to persons listed in the victim's Microsoft Outlook address book, worries analysts who believe the relatively harmless virus was written with a “tool kit” that would allow even the most inexperienced programmers to create viruses. Worms increase in prevalence, with Sircam, CodeRed, and BadTrans creating the most problems. Sircam spreads personal documents over the Internet through email. CodeRed attacks vulnerable web servers and is expected to eventually reroute its attack to the White House homepage; it infects approximately 359,000 hosts in the first twelve hours. BadTrans is designed to capture passwords and credit card information.

2002
David L. Smith, author of the Melissa virus, is sentenced to 20 months in federal prison. The LFM-926 virus appears in early January, displaying the message “Loading.Flash.Movie” as it infects Shockwave Flash (.swf) files. Celebrity-named viruses continue, with the “Shakira,” “Britney Spears,” and “Jennifer Lopez” viruses emerging. The Klez worm, an example of the increasing trend of worms that spread through email, overwrites files (its payload fills files with zeroes), creates hidden copies of the originals, and attempts to disable common anti-virus products. The Bugbear worm also makes its first appearance in September. It is a complex worm with many methods of infecting systems.

2003
In January the relatively benign “Slammer” (Sapphire) worm becomes the fastest-spreading worm to date, infecting 75,000 computers in approximately ten minutes and doubling its numbers every 8.5 seconds in its first minute of infection. The Sobig worm becomes one of the first to join the spam community: infected computer systems have the potential to become spam relay points, and spamming techniques are used to mass-mail copies of the worm to potential victims.

2004
In January a computer worm, called MyDoom or Novarg, spreads through emails and file-sharing software faster than any previous virus or worm. MyDoom entices email recipients to open an attachment that allows hackers to access the hard drive of the infected computer. The intended goal is a “denial of service” attack on the SCO Group, a company that is suing various groups for using an open-source version of its Unix operating system. SCO offers a $250,000 reward to anyone giving information that leads to the arrest and conviction of the people who wrote the worm.
An estimated one million computers running Windows are affected by the fast-spreading Sasser computer worm in May. Victims include businesses, such as British Airways, banks, and government offices, including Britain's Coast Guard. The worm does not cause irreparable harm to computers or data, but it does slow computers and cause some to quit or reboot without explanation. The Sasser worm is different from other viruses in that users do not have to open a file attachment to be affected by it. Instead, the worm seeks out computers with a security flaw and then sabotages them. An 18-year-old German high school student confesses to creating the worm. He's suspected of releasing another version of the virus.

2005
In March the world's first cell phone virus, Commwarrior-A, appears. The virus probably originated in Russia, and it spreads via multimedia (MMS) messages. In the final analysis, Commwarrior-A infects only about 60 phones, but it raises the specter of many more, and more effective, cell phone viruses.

2008
First discovered in November, the Conficker worm is thought to be the largest computer worm since Slammer in 2003. It's estimated that the worm has infected somewhere between 9 and 15 million systems worldwide, including servers in the French Navy, the UK Ministry of Defence, the Norwegian Police, and other large government organizations. Since its discovery, at least five variants of the worm have been released. Authorities think that the authors of Conficker may be releasing these variants to keep up with efforts to kill the worm.

How Do Computers Work?

Computer Basics

To accomplish a task using a computer, you need a combination of hardware, software, and input.

Computer Hardware
Hardware consists of devices, like the computer itself, the monitor, keyboard, printer, mouse and speakers. Inside your computer there are more bits of hardware, including the motherboard, where you would find the main processing chips that make up the central processing unit (CPU). The hardware processes the commands it receives from the software, and performs tasks or calculations.
Computer Software
Software is the name given to the programs that you install on the computer to perform certain types of activities. There is operating system software, such as the Apple OS for a Macintosh, or Windows 95 or Windows 98 for a PC. There is also application software, like the games we play or the tools we use to compose letters or do math problems.
[Cartoon: a computer inputting its own information]
You provide the input. When you type a command or click on an icon, you are telling the computer what to do. That is called input.

How They Work Together

[Drawing: two people using a computer]
First, you provide input when you turn on the computer. Then the system software tells the CPU to start up certain programs and to turn on some hardware devices so that they are ready for more input from you. This whole process is called booting up.
The next step happens when you choose a program you want to use. You click on the icon or enter a command to start the program. Let's use the example of an Internet browser. Once the program has started, it is ready for your instructions. You either enter an address (called a URL, which stands for Uniform Resource Locator), or click on an address you've saved already. In either case, the computer now knows what you want it to do. The browser software then goes out to find that address, starting up other hardware devices, such as a modem, when it needs them. If it is able to find the correct address, the browser will then tell your computer to send the information from the web page over the phone wire or cable to your computer. Eventually, you see the web site you were looking for.
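A small sketch of the first step the browser takes with the address you enter, using Python's standard urllib library. The URL here is a made-up example:

```python
# A very simplified look at what the browser does with an address.
# A URL (Uniform Resource Locator) breaks down into parts: the
# protocol to use, the computer to contact, and the page to ask for.
from urllib.parse import urlparse

url = "http://www.example.com/index.html"   # an example address

parts = urlparse(url)
print(parts.scheme)    # "http" - the protocol the browser will speak
print(parts.netloc)    # "www.example.com" - which computer to contact
print(parts.path)      # "/index.html" - which page to request

# From here, a real browser opens a network connection to that
# computer (starting up hardware such as a modem if needed),
# requests the page, and displays the bytes it gets back.
```

This is only the address-parsing step; actually fetching and rendering the page involves the network hardware and much more software.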
[Diagram: a computer downloading information]
If you decide you want to print the page, you click on the printer icon. Again, you have provided input to tell the computer what to do. The browser software determines whether you have a printer attached to your computer, and whether it is turned on. It may remind you to turn on the printer, then send the information about the web page from your computer over the cable to the printer, where it is printed out.

The Dawn of an Electronic Era

The computer age began when ENIAC (Electronic Numerical Integrator and Calculator) was completed in 1945. The first multipurpose computer, ENIAC set speed records with an amazing 5,000 additions per second. Computers have come a long way since—a laptop today can do 500,000,000 additions per second.
That’s not the only difference. ENIAC weighed more than 30 tons, filled an 1,800-square-foot room and included 6,000 manual switches. It used so much electricity that it sometimes caused power shortages in its home city of Philadelphia. By contrast, a notebook PC today might weigh in at about 3 pounds.
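The comparison above works out to a striking ratio, shown here as a quick back-of-the-envelope calculation (treating "more than 30 tons" as 30 U.S. tons, or 60,000 pounds):

```python
# Figures from the text: ENIAC managed 5,000 additions per second;
# a modern laptop manages about 500,000,000.
eniac_adds_per_sec = 5_000
laptop_adds_per_sec = 500_000_000

speedup = laptop_adds_per_sec // eniac_adds_per_sec
print(speedup)        # 100000 - the laptop is 100,000 times faster

# Weight: roughly 30 tons (60,000 pounds) versus about 3 pounds.
weight_ratio = 60_000 // 3
print(weight_ratio)   # 20000 - ENIAC weighed about 20,000 times more
```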

Booting Up

You may know that “booting” your computer means starting it up. But did you know the word comes from “pulling yourself up by your bootstraps”? That’s an expression that means taking charge of yourself, which is what a computer seems to do when it starts up!

Bugging Out


[Illustration: a moth]
The term “bug” has been used for problems in machinery since electricity was invented. But the first computer bug was actually a moth! In 1945, a computer being tested at Harvard University stalled when a moth got caught inside. The engineers taped the moth into their computer log with the note, “First actual case of bug being found.”

Computer Timeline



1945
The computer age begins with the debut of ENIAC (Electronic Numerical Integrator and Calculator). It is the first multipurpose computer.
1975
The MITS Altair, a PC-building kit, hits stores.
Bill Gates and Paul Allen establish Microsoft.
1976
Steven Jobs and Stephen Wozniak start Apple Computer.
1977
Apple Computer introduces the Apple II computer.
1978
Floppy disks replace older data cassettes.
1981
IBM introduces a complete desktop PC.
1983
TIME magazine names the computer “Machine of the Year.”
1984
The user-friendly Apple Macintosh goes on sale.
1985
Microsoft launches Windows.
1992
The Apple PowerBook and IBM ThinkPad debut.
1996
Palm releases the PalmPilot, a hand-held computer also called a “personal digital assistant.”

Google Login and Claiming Your Places Page Instructions

Here I am going to explain two separate ways to claim your Google Places page, starting with the Google login.
The first thing to do before you try to claim and optimize your Google Places page is to have a Gmail account. If you do not have one set up already, simply go to your browser and do a search for Google mail, or you can find it at mail.google.com. This will take you to the Google login page, where you can either sign into your account or, just underneath the sign-in section, create an account for free in about two minutes or less.
Creating a Gmail Account:
Put in your name.
Choose your user name and check availability; if it is available it will be assigned to you, and if not you will be given suggestions.
Choose a password.
Answer a couple of security questions.
Put in a recovery email in case you forget the password.
Select your country.
Put in your date of birth.
Fill in the Captcha and submit.
You may be asked to verify by phone or text; select whichever works for you, and your account is verified instantly.
Now that that is done, go to the Google login page and sign in.
To find and claim your Places page, click on the Maps tab at the top of your browser. If you do not see the Maps tab, you can do a search for "google maps"; it will be the first organic listing. Click on it and it will bring you to the Maps search browser.
There are two ways to find out if you already have a Google Places page listing. The first way: once you are in the Maps search browser, simply do a search for your company, starting with the company name, then your address and zip, and click Search. If you have an unclaimed listing, it will show up in the left-hand sidebar.
If you show up in the sidebar, simply click on your company name and it will take you to the Places page that the public sees. From here you will see a map to the right with an A pin indicating your location.
Just above the map location with the A pin are two hyperlinks: one says "Edit this place," and the other says "Business owner?" You should still be logged into your Gmail account, so click on the "Business owner?" hyperlink. If you are not logged in, you will be directed to the Google login page. If you are logged in, you will be taken to a page that asks what you would like to do.
You have three choices: Edit your listing, Suspend your listing, or This isn't my listing. Choose Edit.
Once you choose Edit, a new page opens with your information auto-filled. Make any corrections and do a simple basic setup before you start to optimize your Places page. I repeat: just put in some basic information.
IMPORTANT: DO NOT stuff or add any additional information, like your city or what you do, in the business name box; just enter the name as it appears on your business license. If you put in wrong information, or if you accidentally break the rules or guidelines, you will hurt yourself, and optimizing your Places page the right way, so it can be seen on page 1 for your business, will become more difficult.
The second way of finding yourself: when you are in the Maps search browser, there is a link in the left-hand sidebar that says "Put your business on Google Maps." Click on this link; if you are signed into your Gmail account, it will take you to a page where you can put in your business phone number.
Use your local number, not an 800 number, preferably the number that is listed in the local phone books and other directories. Google will then search for your business. If it finds you, it will give you an option to edit your page from the search results; if you do not see your listing, this is where you can add a new listing.
Select Add a New Listing and follow the procedures above for filling in the basic setup.
Do not get creative; I cannot emphasize enough the importance of following the guidelines and best practices. Once you have done your basic setup, you will want to optimize your Places page so you can start to rank better, but that is another article, so until then, have a great day.
I hope this Google login information helped you.
Jay Smith of Condor Marketing has been helping clients with on-page and off-page SEO since 2008. He is an author of the Google Places Optimization Course, which helps clients worldwide achieve their goals with SEO and Google Places optimization. The course has reached 5 countries to date. Visit us to learn how to optimize your Google Places page.
Article Source: http://EzineArticles.com/?expert=Jay_W_Smith

Article Source: http://EzineArticles.com/6413921

Are "Free Websites" Really Free?

This is a great question. The obvious answer is yes. However, what exactly are you going to get for free? If a free website doesn't function well enough to allow you to post your important content or accept emails and hyperlinks, it really isn't worth very much.
However, there are some companies out there which offer great services for free with the hopes you will later upgrade to your own domain name and a professional account plan. This is where they will one day make a small profit. This is why their free service must be generous but still allow for the temptation to upgrade to a "Paid" account.
I've personally built several sites for the purpose of verifying the capabilities of some of these so-called "free websites." I've been quite disappointed on many occasions, but there have been a couple of companies that truly give you what they offer. So I guess the real question here would be: are you looking to have your own website but don't know where to begin?
Many people would love to have their own site but become intimidated when they try to build one using the free services available. It does take time and a little finesse, however, with a little coaching along it is possible for almost anyone to develop their own site.
Using many of the free web site offers found in a quick search engine query, you can easily create your own Web Site and have the basic site "Up and Running" in less than an hour. Of course, fully developing your new website will be a much longer process. However, building and enhancing your own website is a very rewarding experience. It can also be a lot of fun.
The level of features offered by many of the free hosting companies is quite extensive. I work with one particular company that allows video uploads, email links, HTML links, and music capabilities via mp3 files. All these features are completely free. Some of these free websites allow links for referrals to some of your favorite sites. I now have a "Pro-Level" account, but initially I used a free hosting company for three months. I was already earning income after the first two weeks, so I was more than happy to spend a few dollars of my profits to upgrade to a site that offered me unlimited options and features.
If you are looking to build a free website, feel free to contact me personally and I'd be glad to help you get started with your site for no charge and no obligation. Please feel free to contact me (Brian) at my website listed below...
If I can ever be of assistance, please feel free to Contact me, Brian V. Menard, for more information. Or visit my website at http://www.webdini.com
Article Source: http://EzineArticles.com/?expert=Brian_V_Menard

Article Source: http://EzineArticles.com/6429903

Your Small Business and What Apps Can Do For It

It is one of the best things that could happen to small businesses everywhere. What do you need to attract more business? What do you need to help you communicate with your customer base? The answer is the app. "App" is short for application program: a program written and designed for a specific need or purpose.
We are in the age of technology, and if you are not up to speed with the new app technology, your company will have a hard time competing in a global market. The technical side of small business is always changing. Once more small businesses start to explore the benefits of using mobile apps, they will discover it is one of the better, if not the best, ways to reach, grow, and communicate with a customer base. Mobile apps can help you have a more intimate interaction with your clients or customers wherever you are.
If you want your small business to compete in this new technical arena it's best that you be prepared. There are literally thousands of apps available to help you get your small business in touch with the future. Some have a fee and some don't. Don't take shortcuts when it comes to apps because you get what you pay for.
If you are looking to increase your sales, there's an app for that. Or maybe you are looking for apps that help you design a spreadsheet or a survey. Today's small business has to multitask in order to keep up with the constantly growing needs of the company. If you are looking for increased productivity, then perhaps your business can benefit from a management app. There are apps to help with the administrative side of your business, security, and file backup. There is an app for almost every aspect of your business, and if not, one can be made for you. Apps make your business tasks much easier to handle. It's like having a personal assistant at your fingertips no matter where you are.
Apps are here to stay, and they are becoming an integral part of our society worldwide. If you plan for your business to stay in the race and grow, then now is the time to hitch your business to the app wagon. It is the only way you will be able to survive in an app-run marketplace. While you are considering whether apps are right for your business, your competition may be using apps to run their business faster and easier than ever before.

Quarterly Journal of Economics

February 2002, Vol. 117, No. 1, Pages 339-376
Posted Online March 13, 2006.
(doi:10.1162/003355302753399526)
© 2002 President and Fellows of Harvard College and the Massachusetts Institute of Technology.

Information Technology, Workplace Organization, and the Demand for Skilled Labor: Firm-Level Evidence*

Timothy F. Bresnahan
Department of Economics, Stanford University
Erik Brynjolfsson
Sloan School of Management, Massachusetts Institute of Technology
Lorin M. Hitt
Wharton School, University of Pennsylvania
We investigate the hypothesis that the combination of three related innovations—1) information technology (IT), 2) complementary workplace reorganization, and 3) new products and services—constitutes a significant skill-biased technical change affecting labor demand in the United States. Using detailed firm-level data, we find evidence of complementarities among all three of these innovations in factor demand and productivity regressions. In addition, firms that adopt these innovations tend to use more skilled labor. The effects of IT on labor demand are greater when IT is combined with the particular organizational investments we identify, highlighting the importance of IT-enabled organizational change.


Global information technology outsourcing: in search of business advantage

Publication date: October 2000
Language: English
300 pp., 23.5 × 16 cm, hardback

Summary of Global information technology outsourcing: in search of business advantage

During the 1980s, large corporations testified to the competitive advantage that can be achieved through the successful exploitation of IT. By the 1990s, however, Kodak had become one of the first Fortune 500 companies to argue that IT was primarily a commodity best handled by expert vendors, rather than a coveted strategic asset. While many early deals focused on cost reduction, organisations now seek dynamic arrangements that respond quickly to business needs and opportunities. Readers will learn how to achieve significant business advantage by developing flexibility and retaining control of IT outsourcing.
• Based on research with 75 worldwide organisations, including Dupont, British Aerospace and BP Exploration
• From the evidence gathered, the authors distil best practice for this complex and ever-developing market

Contents of Global information technology outsourcing: in search of business advantage

Acknowledgements

Introduction

Chapter One: Overview

Chapter Two: Global Trends and Practices: An Assessment

Chapter Three: Case Studies in Mega Contracting (A) South Australia and Dupont

Chapter Four: Case Studies in Mega Contracting (B) British Aerospace and UK Inland Revenue

Chapter Five: Proven Practices in Evaluating and Negotiating IT Outsourcing Deals

Chapter Six: Making IT Sourcing Decisions: A Framework

Chapter Seven: Preparing for Outsourcing: Risk Mitigation for Business Advantage

Chapter Eight: Preparing for Outsourcing: The Core IT Capabilities Framework

Chapter Nine: Managing Stakeholder Relationships Across Six Phases

Chapter Ten: In Conclusion: Future Outsourcing

Notes

Appendix A 1999/2000 Survey Findings From the USA and UK

Appendix B Case Studies Profiles

Guide to Authors' Publications on IT Outsourcing

Index

Gender Differences in Availability, Internet Access and Rate of Usage of Computers among Distance Education Learners

Abstract:
This study explores the level of availability of computers, Internet accessibility, and the rate of usage of computers both at home and at the workplace among distance education learners according to gender. The results reveal that there are no significant differences in any of the three aspects. The findings indicate that female distance education learners participate equally with their male counterparts in the utilization of computer technology to assist their study requirements, as well as in their involvement in information and communication technology (ICT) to support the educational and learning process as demanded by distance education.
Document Type: Research article
DOI: 10.1080/09523980210166459
Affiliations: 1: Penang, Malaysia
Publication date: 2002-09-01

Computers in healthcare: overview and bibliography

OBJECTIVE: The objective of this article is to provide an overview of computer technology and an associated bibliography, emphasizing institutional-based healthcare applications and pharmacoinformatics.
DATA SOURCES: References were selected from the authors' files and from a computerized search over the last five years on computers in healthcare/medical informatics and in pharmacy.
STUDY SELECTION: Articles selected for review and discussion were considered to be important contributions to the respective areas listed in the bibliography and representative of advancements in computer applications in healthcare and pharmacy.
DATA SYNTHESIS: The computer has become an important support tool for healthcare professionals. Medical informatics and the discipline related to pharmacy, called pharmacoinformatics, have evolved from the cognitive underpinnings of medicine, pharmacy, and computer science. Recent developments in computer technology have resulted in computers that are fast, increasingly portable, and user-friendly. Hospital information systems employ computers in various ways to deal with the vast amount of information used by various departments. Standards for electronic data exchange are being developed to increase the integration potential of these systems. Hospital pharmacists have used computers for drug distribution, financial analysis and inventory control, drug interaction detection, pharmacokinetic dosing, drug information, and drug therapy monitoring. Expert systems are being developed in several areas of drug therapy. Pharmacy educators have developed interactive courseware to help students learn problem-solving skills in the areas of calculations, therapeutics, and drug information.
CONCLUSIONS: Pharmacists need to become more involved with applications of technology to pharmacy. Properly implemented, computers can provide more time for pharmacists to use their cognitive skills in the delivery of pharmaceutical care.

Improving Safety with Information Technology


Health care is growing increasingly complex, and most clinical research focuses on new approaches to diagnosis and treatment. In contrast, relatively little effort has been targeted at the perfection of operational systems, which are partly responsible for the well-documented problems with medical safety.1 If medicine is to achieve major gains in quality, it must be transformed, and information technology will play a key part,2 especially with respect to safety. In other industries, information technology has made possible what has been called “mass customization” — the efficient and reliable production of goods and services according to the highly personalized needs of individual customers.2 Computer retailers, for example, now use their Web sites to allow people to purchase computers built to their exact specifications, which can be shipped within two days. Medical care is, of course, orders of magnitude more complex than selling personal computers, and clinicians have always strived to provide carefully individualized care. However, safe care now requires a degree of individualization that is becoming unimaginable without computerized decision support. For example, computer systems can instantaneously identify interactions among a patient's medications. Even today, more than 600 drugs require adjustment of doses for multiple levels of renal dysfunction, a task that is poorly performed by human prescribers without assistance but can be done accurately by computers.3 Multiple studies now demonstrate that computer-based decision support can improve physicians' performance and, in some instances, patient outcomes.3-6 In the past decade, the risk of harm caused by medical care has received increasing scrutiny.1 The growing sophistication of computers and software should allow information technology to play a vital part in reducing that risk — by streamlining care, catching and correcting errors, assisting with decisions, and providing feedback on performance. 
Given the large potential risks and benefits as well as the costs involved, in this article we analyze what is known about the role and effect of information technology with respect to safety and consider the implications for medical care, research, and policy.
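As a concrete illustration of the renal dose adjustment mentioned above, the rule a computer applies for one drug can be sketched as a simple lookup by creatinine-clearance band. This is a minimal sketch, not from the article; the bands, fractions, and dose are hypothetical:

```python
# A minimal sketch (hypothetical breakpoints and fractions): computing a
# renally adjusted dose by creatinine-clearance band, the kind of rule a
# decision-support system applies automatically, per drug, at order time.

def renal_adjusted_fraction(crcl_ml_min):
    """Fraction of the usual dose to give, by renal-function band."""
    if crcl_ml_min >= 60:
        return 1.0    # normal function: full dose
    if crcl_ml_min >= 30:
        return 0.5    # moderate impairment: half dose
    if crcl_ml_min >= 15:
        return 0.25   # severe impairment: quarter dose
    return 0.0        # near-dialysis range: hold the dose and consult

usual_dose_mg = 500
adjusted = usual_dose_mg * renal_adjusted_fraction(42)  # 0.5 -> 250.0 mg
```

With hundreds of drugs each carrying its own table of bands, the value of doing this in software rather than from memory is clear.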

Ways That Information Technology Can Reduce Errors

Information technology can reduce the rate of errors in three ways: by preventing errors and adverse events, by facilitating a more rapid response after an adverse event has occurred, and by tracking and providing feedback about adverse events. Data now show that information technology can reduce the frequency of errors of different types and probably the frequency of associated adverse events.7-18 The main classes of strategies for preventing errors and adverse events include tools that can improve communication, make knowledge more readily accessible, require key pieces of information (such as the dose of a drug), assist with calculations, perform checks in real time, assist with monitoring, and provide decision support.

Improving Communication

Failures of communication, particularly those that result from inadequate “handoffs” between clinicians, remain among the most common factors contributing to the occurrence of adverse events.19-21 In one study, cross-coverage of medical inpatients was associated with an increase by a factor of 5.2 in the risk of an adverse event.22 A new generation of technology — including computerized coverage systems for signing out, hand-held personal digital assistants (Figure 1: Notification about a Critical Laboratory Result), and wireless access to electronic medical records — may improve the exchange of information, especially if links between various applications and a common clinical data base are in place, since many errors result from inadequate access to clinical data. In the study mentioned above, the implementation of a “coverage list” application, which standardized the information exchanged among clinicians, eliminated the excess risk resulting from cross-coverage.16 Also, many serious laboratory abnormalities — for example, hypokalemia and a decreasing hematocrit — require urgent action but occur relatively infrequently, often when a clinician is not at hand, and such results can be buried amid less critical data. Information systems can identify and rapidly communicate these problems to clinicians automatically (Figure 1), unlike traditional systems in which such results are communicated to a clerk for the unit.12-15 In one controlled trial, this approach reduced the time to the administration of appropriate treatment by 11 percent and reduced the duration of dangerous conditions in patients by 29 percent.23
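The automatic flagging of critical results described above can be sketched as a small rule table plus a notification hook. All test names, thresholds, and the notification mechanism here are hypothetical, illustrating the idea rather than any real system:

```python
# Illustrative sketch: flagging critical laboratory results and routing
# them to a clinician. Thresholds and names are hypothetical.

CRITICAL_RULES = {
    # test name -> (low limit, high limit); None means no limit on that side
    "potassium": (3.0, 6.0),     # mmol/L; catches hypo-/hyperkalemia
    "hematocrit": (21.0, None),  # %; catches a dangerously low hematocrit
}

def check_result(test, value):
    """Return an alert message if the value is critical, else None."""
    low, high = CRITICAL_RULES.get(test, (None, None))
    if low is not None and value < low:
        return f"CRITICAL: {test} = {value} (below {low})"
    if high is not None and value > high:
        return f"CRITICAL: {test} = {value} (above {high})"
    return None

def notify_clinician(patient_id, message):
    # A real system would page or message the covering clinician; we print.
    print(f"[patient {patient_id}] {message}")

alert = check_result("potassium", 2.4)
if alert:
    notify_clinician("12345", alert)
```

The point of the automation is the routing: the alert goes straight to a responsible clinician instead of sitting in a queue with routine results.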

Providing Access to Information

Another key to improving safety will be improving access to reference information. A wide range of textbooks, references on drugs, and tools for managing infectious disease, as well as access to the Medline data base, are already available for desktop and even hand-held computers (e.g., through http://www.epocrates.com and http://www.unboundmedicine.com). Ease and rapidity of use at the point of care were initially problematic but appear to be improving, and hand-held devices are now widely used, especially for drug-reference information.24

Requiring Information and Assisting with Calculations

One of the main benefits of using computers for clinical tasks that is often overlooked is that it makes it possible to implement “forcing functions” — features that restrict the way in which tasks may be performed. For example, prescriptions written on a computer can be forced to be legible and complete. Similarly, applications can require constraints on clinicians' choices regarding the dose or route of administration of a potentially dangerous medication. Thus, a dose that is 10 times as large as it should be will be ordered much less frequently if it is not one of the options on a menu (Figure 2: Percentage of Medication Orders with Doses Exceeding the Maximum). Indeed, forcing functions have been found to be one of the primary ways in which computerized order entry by physicians reduces the rate of errors.26 The usefulness of forcing functions may also apply to other types of information technology. For example, bar-coded patient-identification bracelets designed to prevent accidents, such as the performance in one patient of a procedure intended for another patient, function in this way.27 Similarly, many actions imply that another should be taken; these dependent actions have been termed “corollary orders” by Overhage et al.28 For example, prescribing bed rest for a patient would trigger the suggestion that the physician consider initiating prophylaxis against deep venous thrombosis. This approach — which essentially targets errors of omission — has resulted in a change in behavior in 46 percent of cases in the intervention group, as compared with 22 percent of cases in the control group, with regard to a broad range of actions.28 The use of computers can also reduce the frequency of errors of calculation, a common human failing.29 Such tools can be used on demand — for example, by a nurse in the calculation of an infusion rate.
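Both mechanisms above — a forcing function that only offers valid doses, and a corollary order triggered by a related action — reduce to a few lines of lookup logic. This is a hypothetical sketch (the drug, menu, and suggestion are invented for illustration):

```python
# Illustrative sketch: a "forcing function" in order entry that only
# accepts doses from a pre-approved menu (so a tenfold overdose cannot
# be selected), plus a "corollary order" lookup that suggests a
# dependent action. Drug names, doses, and suggestions are hypothetical.

DOSE_MENU = {"warfarin": [1.0, 2.0, 2.5, 5.0]}          # mg; the only options offered
COROLLARIES = {"bed rest": ["consider DVT prophylaxis"]} # order -> suggested actions

def place_medication_order(drug, dose_mg):
    """Accept an order only if the dose is one of the menu options."""
    allowed = DOSE_MENU.get(drug, [])
    if dose_mg not in allowed:
        raise ValueError(f"{dose_mg} mg of {drug} is not on the menu: {allowed}")
    return {"drug": drug, "dose_mg": dose_mg}

def corollary_orders(order_name):
    """Dependent actions to suggest when a given order is placed."""
    return COROLLARIES.get(order_name, [])
```

A 25 mg warfarin order simply cannot be entered, and an order for bed rest surfaces the prophylaxis suggestion at the moment of ordering.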

Monitoring

Monitoring is inherently boring and is not performed well by humans. Moreover, so many data are collected now that it can be hard to sift through them to detect problems. However, if the monitoring of information is computerized, applications can perform this task, looking for relations and trends and highlighting them, which can permit clinicians to intervene before an adverse outcome occurs. For example, “smart” monitors can look for and highlight signals that suggest the occurrence of decompensation in a patient — signals that a human observer would often fail to detect (Figure 3: “Smart” Monitoring in an Intensive Care Unit).30 A related approach that appears to be beneficial on the basis of early data is technology-enabled remote monitoring of intensive care. In one study, remote monitoring in a 10-bed intensive care unit was associated with a reduction in mortality of 68 percent and 46 percent as compared with two different base-line periods, and the average length of stay in the intensive care unit and related costs each decreased by about a third.17 Such monitoring is especially attractive in the intensive care unit because there is a national shortage of intensivists.
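What makes a monitor "smart" in the sense above is that it looks at trends rather than single readings. A minimal sketch, with a hypothetical window size and drop threshold, might flag a steadily falling hematocrit that no per-reading alarm would catch:

```python
# Illustrative sketch: trend-based monitoring. Flags a steady decline
# across recent readings, which a single-value threshold alarm misses.
# The window size and drop threshold are hypothetical.

def falling_trend(readings, window=4, drop=5.0):
    """Flag if the last `window` readings decline monotonically by more than `drop`."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    monotonic = all(a > b for a, b in zip(recent, recent[1:]))
    return monotonic and (recent[0] - recent[-1]) > drop

hematocrit = [34.1, 33.0, 31.2, 28.0]  # successive measurements, %
# Each reading alone looks tolerable, but the trend (a drop of ~6 points)
# is the signal worth escalating.
```

Real systems combine many such rules over vital signs and laboratory data; the principle, relations and trends rather than isolated values, is the same.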

Decision Support

Information systems can assist in the flow of care in many important ways by making available such key information on patients as laboratory values, by calculating weight-based doses of medications, or by red-flagging patients for whom an order for imaging with intravenous contrast material may be inappropriate. A longer-term benefit will occur as more sophisticated tools — such as computerized algorithms and neural networks — become integrated with the provision of health care. Neural-network decision aids allow many factors to be considered simultaneously in order to predict a specific outcome. These tools have been developed in order to reduce diagnostic and treatment errors in numerous clinical settings, including the assessment of abdominal pain, chest pain, and psychiatric emergencies and the interpretation of radiologic images and tissue specimens.31 Controlled trials have demonstrated improvement in clinical accuracy with the use of such technical tools, including their use in the diagnosis of myocardial infarction,32,33 the detection of breast cancer on screening mammograms,34 and the finding of cervical neoplasia on Papanicolaou smears.35 However, of these practices, only neural-network–assisted cervical screening has had substantial use, and little of that use has been in the United States.31,36 Nonetheless, more widespread use of electronic medical records could lead to an expanded role for these applications and make it easier to integrate them into routine care.

Rapid Response to and Tracking of Adverse Events

Computerized tools can also be used with electronic medical records to identify, intervene early in, and track the frequency of adverse events — a major gap in the current safety-related armamentarium — since, to improve processes, it is important to be able to measure outcomes.37 Classen et al. pioneered an approach for combing clinical data bases to detect signals that suggest the presence of an adverse drug event in hospitalized patients, such as the use of an antidote; this approach identified 81 times as many events as did spontaneous reporting, which is the standard technique used today.38 Others have built applications that allow the detection of nosocomial infections in inpatients39 and adverse drug events in outpatients.40 Such tools may be useful both for the improvement of care and for research. Together with Indiana University, we are conducting a controlled trial to evaluate computerized prescribing for outpatients. In the first year of this study, we built a computerized monitor for adverse drug events, which goes through the electronic medical record to detect signals (such as high serum drug levels) that suggest that an adverse drug event may have occurred (Table 1: Results of Screening for Drug-Related Adverse Events with the Use of Electronic Medical Records for Outpatients). This approach inexpensively identifies large numbers of adverse drug events that are not routinely detected. We are now using the rates of events to assess the effect of computerized prescribing, first with simple and then with more advanced decision support. Electronic tools designed to identify a broad array of adverse events in a variety of settings seem promising.41 Often, these signals may permit earlier intervention; for example, Raschke et al. found that 44 percent of the alerts generated by a tool that they built had not been identified by the team of clinicians.5
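The signal-combing approach described above can be sketched as a scan of each record for a small set of triggers, such as an antidote order or a high serum drug level. The signal tables and record layout here are hypothetical, not the monitor the authors built:

```python
# Illustrative sketch: screening one patient record for signals that
# suggest an adverse drug event (ADE). Signal names, thresholds, and
# the record format are hypothetical.

ANTIDOTES = {"naloxone", "flumazenil"}   # orders whose presence suggests an ADE
LEVEL_LIMITS = {"digoxin": 2.0}          # serum-level upper limits (ng/mL)

def ade_signals(record):
    """Return the list of ADE signals found in one patient record."""
    signals = []
    for drug in record.get("orders", []):
        if drug in ANTIDOTES:
            signals.append(f"antidote ordered: {drug}")
    for drug, level in record.get("levels", {}).items():
        limit = LEVEL_LIMITS.get(drug)
        if limit is not None and level > limit:
            signals.append(f"high serum level: {drug} = {level}")
    return signals

record = {"orders": ["naloxone"], "levels": {"digoxin": 3.1}}
# ade_signals(record) fires on both rules for this record.
```

Run over an entire data base, such a screen surfaces candidate events cheaply; each flagged record is then reviewed by a human to confirm whether an adverse event actually occurred.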

Medication Safety and the Prevention of Errors

After anesthesia, medication safety has perhaps been the most closely studied domain in patient safety. Efforts to reduce the rate of medication errors have involved all the strategies discussed above. Nearly half of serious medication errors have been found to result from the fact that clinicians have insufficient information about the patient and the drug. Other common factors include a failure to provide sufficient specificity in an order, illegibility of handwritten orders, errors of calculation, and errors in transcription.7 In one controlled trial involving inpatients, the implementation of a computerized application for order entry by physicians — which improves communication, makes knowledge accessible, includes appropriate constraints on choices of drugs, routes, frequencies, and doses, helps with calculations, performs real-time checks, and assists with monitoring — resulted in a 55 percent reduction in serious medication-related errors.8 In a further study, which evaluated serial improvements to this application with the addition of higher levels of support for clinical decisions (e.g., more comprehensive checking for drug allergies and drug–drug interactions), there was an 83 percent reduction in the overall rate of medication errors.9 The use of decision support for clinical decisions can also result in major reductions in the rate of complications associated with antibiotics, and can decrease costs and the rate of nosocomial infections.10 Other technological tools with substantial potential but less solid evidence of effectiveness include the bar coding of medications and the use of automated drug-delivery devices for both oral and intravenous medications.11
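The real-time checks described above — drug allergies and drug-drug interactions at the moment of order entry — amount to lookups against the patient's current record. This sketch uses hypothetical allergy and interaction tables for illustration:

```python
# Illustrative sketch: the checks a computerized order-entry system runs
# before accepting a new prescription. The interaction table and the
# example patient data are hypothetical.

INTERACTIONS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}

def order_warnings(new_drug, current_drugs, allergies):
    """Collect allergy and drug-drug interaction warnings for a new order."""
    warnings = []
    if new_drug in allergies:
        warnings.append(f"ALLERGY: patient is allergic to {new_drug}")
    for drug in current_drugs:
        note = INTERACTIONS.get(frozenset({new_drug, drug}))
        if note:
            warnings.append(f"INTERACTION with {drug}: {note}")
    return warnings

# An aspirin order for a patient already on warfarin raises the
# interaction warning; the penicillin allergy is not triggered.
warnings = order_warnings("aspirin", ["warfarin"], {"penicillin"})
```

Using a `frozenset` key makes the interaction lookup symmetric, so the warning fires no matter which of the two drugs is ordered second.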

Summary of Approaches to Prevention

To date, studies have generally been conducted only in individual facilities and rarely in the outpatient setting; moreover, only a few types of technology have been well tested. However, the large benefits found in the improvement of fundamental aspects of patient care8,12,13,16-18 indicate that information technology can be an important tool for improving safety in many clinical settings. Tools that can improve communication, make knowledge more accessible, require key information, and assist with calculations and clinical decision making are available today and should provide substantial benefit. More research is needed on such questions as how best to perform checks, how best to assist in monitoring, and especially, how to provide decision support most effectively in complex situations. In today's systems, many important warnings are ignored,42 and there are too many unimportant warnings. Approaches have been developed to highlight more serious warnings — for instance, by displaying a skull and crossbones — when a clinician tries to order a drug that has previously caused an anaphylactic reaction in the patient (Figure 4: Warning Displayed for a Drug Allergy). However, many efforts directed at complex targets such as the management of hypertension44 or congestive heart failure45 have failed. Overcoming these difficulties will require bringing cognitive engineers and techniques for assessing and accommodating human factors, such as usability testing, into the design of medical processes.

Barriers and Directions for Improvement

Despite the substantial opportunities for improvement in patient safety, the development, testing, and adoption of information technology remain limited. Numerous barriers exist, although some approaches to overcoming them are at hand.

Financial Barriers

The development of medical applications of information technology has largely been commercially funded, and reimbursement has rewarded excellent billing rather than outstanding clinical care. As a result, the focus has been more on products to improve the “back-office” functions related to clinical practice than on those that might improve clinical practice itself. Since they depend on new capital, research and development efforts for clinical tools have had relatively limited funding. When companies have produced useful technological tools, their spending on clinical testing has been negligible, particularly in comparison with what is spent on the testing of medical devices or drugs.46 Furthermore, even for proven applications, such as computerized order entry for physicians, vendors do not have ready-made products.47 For clinicians and institutions seeking to adopt technological tools, the investment costs can be high,48 and the quality of the decision support that comes along with these applications remains highly variable.49 Progress on this front is unlikely to occur without considerable investment — particularly public investment — in clinical information technology. Incentives could make an important difference. To increase capital investment, legislation has been introduced in the U.S. Senate to provide nearly $1 billion over a period of 10 years to hospitals and Medicare-supported nursing homes that implement technology that improves medication safety.50 Of concern, however, are measures that mandate the adoption of such technology without providing the funding for doing so. California, for example, has passed a law requiring, as a condition of licensure, that all nonrural hospitals implement technology such as, but not limited to, computerized order entry for physicians by January 1, 2005.51 Neither an increase in reimbursement nor capital grants were provided to help hospitals to meet this requirement. 
A piece of national legislation in this area — the Patient Safety Improvement Act of 2003 (H.R. 877) — was passed by the House of Representatives on March 12, 2003. This bill would provide $50 million in grants over a two-year period to institutions that implement information technology intended to improve patient safety. Forms of technology that are named include electronic communication of patient data, computerized order entry by physicians, bar coding, and data support technology. Although this is a positive development, these incentives are sufficiently limited that their effect would most likely be small.52

Lack of Standards

We lack a single standard in the United States today for representation of most types of key clinical data, including conditions, procedures, medications, and laboratory data.53 The result has been that most applications do not communicate well, even within organizations, and the costs of interfaces are high. Another highly charged issue is that standards for some important types of data are privately held. Privately held standards are standards that are in general use but are licensed by a company or organization. Examples of privately held standards are diagnosis codes that are licensed by the College of American Pathologists and procedure codes that are licensed by the American Medical Association. However, there are both short-term and longer-term opportunities in this area. The National Committee on Vital and Health Statistics recently released a report54 endorsing national standards for electronic data for key domains. The adoption of the Consolidated Health Informatics standards by the federal government on March 21, 2003, represents a major step forward.55 This initial set includes standards for messaging, images, and clinical laboratory tests. Such standardization will encourage innovation and the adoption of applications with relatively little cost to the government. Although standards are not fully developed for every important type of information, the identification of this area as a major priority should make it possible to do the additional work required, especially if federal funding to support it is provided. An important, open question is whether any organization should be able to hold a national standard privately. We believe that one appropriate approach would be to require organizations to sell such classification systems for a fair price.

Cultural Barriers

There is also a tendency for clinicians and policymakers to see information technology as relatively unimportant for either research efforts or incorporation into medical practice. Academic centers are more apt to seek and reward faculty members who pursue research on a drug or a device that might lead to a reduction of 0.5 percent in the rate of death from myocardial infarction than those who develop a decision-support system that could result in a far greater reduction. Furthermore, clinicians have been reluctant to adopt information technology even when it has been shown to be effective. This reluctance appears to have a number of causes. It is still a new concept in medicine that computerized tools can have powerful benefits in practice. When errors occur, physicians are no less likely than the public to see the clinicians involved, rather than the system, as the central problem.2 In addition, many physicians are still uncomfortable with computers. Some are concerned about depending on them, particularly for clinical decision making. With regard to certain technological tools, such as e-mail between physicians and patients and electronic medical records, clinicians are also concerned about legal issues, including privacy. Not only the government but clinicians too, in their practices and relationships with colleagues and health care facilities, must recognize that most preventable adverse events result from failures of systems, not individual failures. Investment in and adoption of new forms of information technology must be understood as being as vital to good patient care as the adoption of new technological tools for diagnosis and treatment.

Current Situation

Overall, few of the types of information technology that may improve safety are widely implemented. For example, few hospitals have adopted computerized order entry for physicians. However, the Leapfrog Group — a coalition of some of the nation's largest employers, such as General Electric and General Motors — has identified it as one of three changes that they believe would most improve safety,56 and many hospitals are beginning on this path. Use of computer-assisted decision making in diagnosis and the planning of treatment remains rare. Furthermore, the quality of the clinical software applications that are currently being developed remains unclear. Especially given the absence of widely used standards, organizations have been reluctant to make large financial commitments, fearing that they will select a dead-end solution. Another pivotal issue is that information technology has been seen by many health care organizations as a commodity, like plumbing, rather than as a strategic resource that is vitally important to the delivery of care. Exceptions are institutions such as the health systems of the Department of Veterans Affairs and Kaiser, and reported data suggest these strategies have been successful.57-59

Conclusions

The fundamental difficulty in modern medical care is execution. Providing reliable, efficient, individualized care requires a degree of mastery of data and coordination that will be achievable only with the increased use of information technology. Information technology can substantially improve the safety of medical care by structuring actions, catching errors, and bringing evidence-based, patient-centered decision support to the point of care to allow necessary customization. New approaches that improve customization and gather and sift through reams of data to identify key changes in status and then notify key persons should prove to be especially important.
Supported in part by a grant (PO1 HS11534) from the Agency for Healthcare Research and Quality (to Dr. Bates). Dr. Bates reports having served as a paid lecturer for Eclipsys and as a consultant for MedManagement and Alaris. We are indebted to Amar Desai for comments on previous versions of this manuscript and to Anne Kittler for assistance with the preparation of the manuscript.

Source Information

From the Division of General Medicine and Primary Care, Department of Medicine (D.W.B.), and the Department of Surgery (A.A.G.), Brigham and Women's Hospital; the Center for Applied Medical Information Systems, Partners HealthCare System (D.W.B.); and Harvard Medical School (D.W.B., A.A.G.) — all in Boston. Address reprint requests to Dr. Bates at the Division of General Medicine and Primary Care, Brigham and Women's Hospital, 75 Francis St., Boston, MA 02115, or at .