Friday, December 14, 2007

Lab log #5

Part A:

Ethernet adapter Local Area Connection:

Connection-specific DNS Suffix: bc.hsia.telus.net

IP Address: 192.160.0.130

Subnet Mask: 255.255.255.0

Default Gateway: 192.160.0.1

Connection-specific DNS suffix: ???

IP Address: Every computer connected to the Internet has an identifiable numerical address, known as the IP address; these allow computers to find each other, much like street addresses do for buildings.

Subnet Mask: A “mask” that marks the part of an IP address shared by every computer on the network. In essence, it determines where the network portion of the address ends and the host (node) number begins.

Default Gateway: The gateway that traffic is sent to when no more specific gateway is specified for a destination; it is the network’s exit point to other networks, such as the Internet.
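To make the subnet mask idea concrete, here is a minimal sketch using Python’s built-in ipaddress module and the address and mask from the log above; the way the host number is pulled out at the end is just an illustration, not something from the lab itself.

# How the subnet mask splits an address into a network part and a host part,
# using the values recorded above.
import ipaddress

iface = ipaddress.ip_interface("192.160.0.130/255.255.255.0")

print(iface.network)                                  # 192.160.0.0/24 -> the shared network portion
host_number = int(iface.ip) & int(iface.network.hostmask)
print(host_number)                                    # 130 -> this machine's host (node) number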

Part B:

Interface: 192.168.0.103 --- 0x2

Internet Address: 192.168.0.1

Physical Address: -15-e9-76-7f-14

Type: Dynamic

Internet Address: A number assigned to identify a network host on the Internet; it is made up of up to three parts: the network number, an optional subnet number, and the host number.

Physical address: The “real” hardware address burned into a network card (its MAC address), which is what this listing shows. The same term is also used for memory: a physical address is what gets put onto the address bus in order to access a particular memory bank or device.

Type: A descriptor of the kind of entry; here it indicates whether the address mapping is dynamic or static.

Dynamic: In this context it means the address mapping was learned automatically and will expire after a while, rather than being entered by hand. (The first hit in my search was DRAM, a sub-group of RAM that needs to be constantly “refreshed” in order to hold its data, which is a different use of the word.)
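As a rough sketch of how that table could be pulled programmatically, the Python below shells out to "arp -a" (presumably the same command behind this log) and picks out IP/MAC pairs with a regular expression; the exact output format varies by operating system, so the pattern is an approximation rather than anything from the lab.

# List IP -> MAC pairs from the local ARP cache. Output formatting differs
# between Windows and Unix-like systems, so the regex is deliberately loose.
import re
import subprocess

output = subprocess.run(["arp", "-a"], capture_output=True, text=True).stdout
pairs = re.findall(
    r"(\d{1,3}(?:\.\d{1,3}){3})\D+?([0-9A-Fa-f]{2}(?:[-:][0-9A-Fa-f]{2}){5})",
    output,
)
for ip, mac in pairs:
    print(ip, "->", mac)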

Bibliography – Google Define searches (various sources); http://www.wikipedia.com/

Thursday, December 13, 2007

WLAN and you (And everyone else in range of the signal)

WLANs (Wireless Local Area Networks) are an easy way for users to connect to the Internet without physically plugging their computer into a network. While certainly a useful feature, especially in areas with many Internet users who need to move around while staying connected, there are a number of drawbacks that can make this technology troublesome. Not only is it taxing on a network (wireless connections use more bandwidth for less upload and download speed), it can also be dangerous. Most WLANs are easy for an experienced hacker to hijack, allowing them to download illegal programs, copy credit card information sent over the wireless link, and send viruses to other users, all while remaining just another anonymous user on the network. It’s nearly impossible to track a hacker who is using someone else’s wireless Internet, and innocent people have been prosecuted for the illegal acquisition of programs when in reality a hacker was “piggybacking” on their wireless signal. However, there are a number of ways to protect your wireless network (or to protect yourself when joining someone else’s). Here are four different problems, each with its own solution, or at least a way of minimizing the damage.

1) Illegitimate wireless access points- An extremely effective form of scamming. A skilled black hat (a hacker who hacks for the sake of causing damage; there are also white hats, who hack for the benefit of others, and grey hats, who just generally mess around with program code) hijacks a network and then, using special software, routes it through his (or her) computer. This creates a wireless access point that originates from the hacker’s machine. Usually set up somewhere that normally offers pay-for-use wireless Internet, the hacker creates his own payment form for any hapless users who connect to his server. In order to access the Internet, the user enters their credit card information onto the form so they can be billed, obliviously sending it to the hacker as well. In a matter of minutes a black hat can spend thousands of dollars using the pirated credit card information. The only true way to avoid this is to never enter your credit card information on an unknown network. If you MUST get on the Internet, ask an employee who knows the specifics of their wireless network, so you can tell which access points are legitimate.

2) The spread spectrum – Many 802.11 wireless LAN standards use spread-spectrum modulation. The technique was created back in World War 2 to prevent the jamming of radio signals. Much later, when wireless LANs began to appear around the 1990s, vendors declared that spread spectrum was top-of-the-line security for early adopters of wireless networks. Spread spectrum works by “spreading” the transmission according to a code, which in theory makes it impossible for an illegitimate user to access the network if they don’t know the spreading codes. The flaw in this supposedly flawless system is that the 802.11 standards require the code to be broadcast publicly so that equipment from different companies can communicate with each other. Because of this, any hacker with an 802.11-compliant radio NIC can connect, and the spread spectrum is rendered nearly useless as a security measure. While it is possible to disable the public code distribution, it isn’t particularly effective, since every time you want to allow a new user onto the network you have to hand them the codes. That is not only a hassle, it also means one more person who can leak the codes.

3) WEP (Wired Equivalent Privacy) – This old security fix encrypts each frame that is sent over the Wi-Fi link. It lets the user send private e-mails, passwords, usernames, credit card numbers, etc. without worrying about them falling into the wrong hands. Unfortunately, people place too much trust in it. It is a flawed system: the original version only supports a 40-bit key, a restriction that came from government export policies at the time. A half-decent hacker can recover a WEP key in a matter of minutes with the right software (for example, AirSnort). There are a number of ways to increase its reliability, such as randomizing the key (hackers often use “dictionary” attacks, testing words from the dictionary to see if they unlock the system) and using software that periodically changes the WEP key. However, the best way to solve the problem is to simply upgrade to WPA/WPA2, whose encryption has so far proven nigh-unbreakable.
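For a sense of scale on that 40-bit limit, here is a back-of-the-envelope Python sketch comparing the 40-bit key space with the 104-bit one used by 128-bit WEP; the guess rate is a made-up number for illustration, and real tools like AirSnort exploit weak IVs rather than brute force, so actual attacks are far faster than this suggests.

# Brute-force key-space arithmetic only; not how AirSnort-style attacks work.
keys_40 = 2 ** 40                 # about 1.1 trillion possible 40-bit keys
keys_104 = 2 ** 104               # the 104-bit secret used by 128-bit WEP

guesses_per_second = 1_000_000    # assumed rate, purely for illustration
print(keys_40 / guesses_per_second / 86_400, "days to try every 40-bit key")
print(keys_104 / guesses_per_second / 86_400 / 365, "years to try every 104-bit key")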

4) SSIDs (Service Set Identifiers) – The only setting strictly required to set up a wireless LAN, it works by requiring the computer that is trying to join the network to present the same SSID as the access point. The problem is that most access points literally hand out their SSID with every broadcast (not even a second apart). Older hackers would have needed a program such as AiroPeek to grab it, but Windows XP now has an automatic “sniffer” built in and can usually get the job done by itself. While the broadcaster can manually stop the SSID from being broadcast, it can still be gleaned by a hacker using a technique known as “framing”: by watching the frames sent to a receiver, they can copy the SSID from them. Not only that, but most users don’t even bother to change the SSID from its manufacturer default. A few hours of “war driving” (driving around and actively scanning for Wi-Fi-enabled areas, then marking them down for later use and abuse) and you’ll have whole sets of the same default SSIDs. There isn’t much one can do to increase the security of SSIDs other than shutting off the broadcast; the best way to avoid the problem is to rely on the other security features as well.



Bibliography –
http://en.wikipedia.org/wiki/WLAN
http://compnetworking.about.com/cs/wirelessproducts/g/bldef_wlan.htm
http://www.oreillynet.com/pub/a/wireless/2002/04/19/security.html
http://www.wi-fiplanet.com/tutorials/article.php/1457211

Monday, November 26, 2007

Electromagnetic Interference

What is EMI? : Electromagnetic interference is a common problem for those who use wireless signals, such as Wi-Fi Internet and/or radio. Also known as RFI (Radio Frequency Interference), it is usually caused by an external device that is releasing electromagnetic radiation. There are a variety of devices that can cause EMI, among them:
Cell phones: The device for the working man on the go, it allows the user to receive a phone call while they are in range of a cellular network. The mild radiation it releases will sometimes cause static and signal problems; that radiation has also been linked to a noticeable drop in a human’s ability to suppress his or her allergies, as well as to causing what was previously known as “tennis elbow” (due to the elbow holding the hand in an upright position for prolonged periods of time). Cell phone interference can be reduced in a workplace by having a no-cell-phone policy; any other way would require either electronic jamming (which is illegal) or special signal-trapping paint (which is not illegal). However, both of these also stop you from using your own wireless devices, so it’s really a double-edged sword.
Microwaves: One of the more notorious sources. While certainly annoying, a microwave lacks the power to fully cripple a wireless signal. However, because of its common use, many people know its annoying tendency to create static, especially among cordless home phone users. Keeping a cordless phone receiver in a different room than the microwave will definitely help, as a microwave’s emissions aren’t nearly powerful enough to penetrate a wall.
EMPs (Electromagnetic Pulses): A very rare form of EMI, these are usually created when a nuclear bomb is set off. They are also created by devices known as “pinches”, which were made to re-create the effects of an EMP released by the detonation of a nuke (usually for study purposes) while avoiding all of the messy destruction and complaints from environmental activists. There isn’t much one can do about this, other than make sure they don’t live near a government testing facility, as an EMP is more than powerful enough to thwart most standard EMI protection. In more sensitive electronics, it can even completely fry the internal parts of the device.
Magnetrons: Extremely powerful electromagnets, they can completely shut down sensitive electronic devices in a small area. However, they require a current to be running through them in order to create the interference; the more power, the larger and more potent the electromagnet’s interference. Depending on the quality of the material, it can create a larger area of interference with less power. They’re usually made of ferromagnetic elements such as iron. By cutting off a magnetron’s power supply, you eliminate its potential to cause EMI.
Generally, electromagnetic interference can be minimized (or possibly eliminated) by a number of different methods, the most obvious being simply not using an EMI-causing device near a susceptible wireless device. There are also materials and devices that help to reduce EMI emissions, or that shield your device from EMI.

Friday, October 26, 2007

Assignment 1: Module B- Operating system review

There are many different kinds of operating systems floating around out there, but there’s one that stands out: it creates the most controversy, it has more ups and downs than a roller coaster, and it has one sexy interface. It goes by the name of Windows Vista, and it’s here to show us just what 5 years of "innovative" (cough) research and programming can offer. Created by Microsoft, it was originally known by its code name "Longhorn". The upgrade was made for a number of reasons, and came with literally hundreds of reworked, updated, and new features. Among the updated programs was an improved search engine (named "Instant Search" for its ability to search while you type), better peer-to-peer file sharing (mostly in the context of cameras, Xbox 360s, etc.), and significantly increased Internet security (which was one of the largest flaws in Windows XP). New features include the completely redesigned graphical interface, dubbed "Aero" (Authentic, Energetic, Reflective, Open), and Windows DVD Maker.
Windows Vista allows the user to run Vista-only games and programs, raises Windows XP’s old 16GB RAM maximum to a blistering 64GB (not that anyone could afford that much RAM, or need it, for that matter; a number of hacks and custom retrofitting also allowed Windows XP to run up to 64GB of RAM), and, with the upgraded OS graphics, lets the user pan out all of his or her currently open windows to see a small part of each page. Not particularly useful, but it does look nice. However, all of this eye candy comes at a high price: Windows Vista Ultimate’s Aero interface is incredibly taxing on a computer’s resources, and requires a minimum of 1GB of RAM to run, among other requirements. Not only that, but, excluding the ability to run Vista-only applications, any Windows XP user can download (for free) an Aero-esque look for their computer, including the semi-transparent windows and the sidebar (which holds a number of features, such as current weather, sports scores, etc.). Vista has also been criticized for its attempt at blocking the copying (a.k.a. bootlegging) of digital media and its lack of drivers compatible with old peripherals.
Not only that, but Apple’s new program "Parallel Interface" allows any Mac to run a Windows OS while simultaneously running the Mac OS too; this will supposedly soon be followed by a Windows-compatible version, allowing the Mac OS to be run on a normally Windows-only computer. However, there is a silver lining in this for Microsoft: Steve Jobs’ obsession with keeping everything about Apple exclusive to Macs means that the Mac OS that works with it will be purposely crippled in order to keep Mac users from completely switching to standard Windows computers. While this OS is certainly a step up from previous versions (Windows ME coming to mind), it doesn’t really seem like a large enough jump for five years of programming, and the features that make it "unique" are available free off the Internet, or, alternatively, will set you back $400 for Windows Vista Ultimate, the cheapest version of the Vista OS that is equipped with Aero. For the billions of dollars they probably invested in this, it just really doesn’t match up to Mac OS X Tiger, which is cheaper, more reliable, and had many of the "new" features that Vista now has years ago.
By the time the new Mac OS (OS X Leopard) comes out, Vista will probably have a decent following; however, I can honestly say that Leopard will probably trump Vista in most, if not all, areas.

Tuesday, October 23, 2007

Lab log #3

Question 3:
There are literally an infinite number of arguments that could be waged over this. You have your overbearing mothers who think that better than no storage space would be no computer in the first place (and they probably wouldn’t know which computer part is a storage device anyway, barring it coming up and biting them in the butt). Then you have the nerds who think a terabyte is hardly enough to run Solitaire on. But really, it’s not a question of how much we need; it’s a matter of whether we need it in the first place (this following bit essentially goes against everything I love about computers, also known as shooting myself in the foot). We got on just fine without computers. Oh sure, they’ve certainly been helpful, but they don’t provide sustenance (unless you shop online), nor are they able to provide the pleasurable company which we require to procreate. Assuming we actually needed a computer, though, there is no true answer to the stated question. Everyone has their own specific needs: one person might just use it to write reports, another might need a couple of terabytes because they have a massive gaming rig, and don’t forget that companies need literally hundreds or even thousands of terabytes to host all of the information they need to store. In truth, most people have more computer storage than they’ll ever use; we long ago surpassed the minimum that the average consumer needs in a personal computer.
Question 4:
When you send a company your personal information, it should remain confidential. The fewer eyes that see it, the better. Now, some people may just not care, but most of the human populace would probably get the creeps if they found out that any random person could be looking at their personal files. Some have secrets to hide, others may be worried about their personal information being used against them, and maybe others are just simply shy (in a sense). However, if there’s one thing that’s worse than letting others see your personal documents, it’s selling them to the highest bidder. While this may not seem as bad, taking into account that fewer people will look at them, learning that people are profiting off your secrets would be nerve-racking enough, never mind worrying about whose hands they end up in. There is no reason why a company should give away its clients’ personal files, for free or otherwise.
I could let you see my personal files, but then I would have to kill you.

Thursday, October 18, 2007

Computer components

S-Video (Separate Video)
Price: $15
Resolution: 480i and 576i; when used in conjunction with a component cable, capable of 576p, 720p, 1080i, and 1080p
Display quality: Good with plain S-Video; high with a DVI/component adapter (S-Video isn’t designed to handle high-definition video on its own)

DVI (Digital Visual Interface)
Price: $50
Resolution: 576p, 720p, 1080i, 1080p
Display quality: Very high, currently only topped by HDMI

RGB (Red, Green, Blue) (analog), a.k.a. RGBA (Red, Green, Blue, Alpha)
Price: $25
Resolution: 1080p
Display quality: Fairly high

Pros and cons

S-Video
Pros: Flexible, able to carry multiple resolutions. Capable of "hot swapping" (can be disconnected and reconnected while the device is on).
Cons: Requires a separate cable for sound. Resolutions other than the standard 480i and 576i are inferior to a connector designed for those higher resolutions.

DVI
Pros: High-quality image. Can be used alongside HDMI. Since only one cable is needed to carry the RGB colours, the information is transferred faster, which makes the image significantly sharper.
Cons: Expensive. Tech-savvy friends may drool on your high-def screen in wonder. HDMI offers higher resolutions than DVI.

RGB
Pros: On a SCART RGB display, the image is better than a standard S-Video image. Still used almost universally for computer monitors. With SCART, the same connector can carry both S-Video and RGB signals.
Cons: Many different worldwide standards mean it lacks universal usage, and SCART is rare outside of Europe. To maintain colour consistency during a movie etc. it must be routinely calibrated, which increases the likelihood of it being damaged by reducing its gamut. SCART cannot run both S-Video and RGB at the same time, and even with SCART it cannot carry other component formats (excluding S-Video) such as YPbPr. "Hot swapping" runs the risk of damaging the device.

Monday, October 15, 2007

Computer Questions

Describe electricity.
1. Scientist’s definition: Electricity is only used to describe the process in which protons and electrons produce a charge, i.e. current/quantity/coulombs of electricity.
2. Standard definition: Electricity is the electromagnetic field that is given off by such things as generators and batteries.
3. School definition: The flow of charged electrons through a wire/circuit.
4. Other definition: Electricity refers to the imbalance of the number of protons and electrons.
What is a conductor? What is an insulator?
Conductors are materials that contain movable positive and negative charges. When an electrical potential difference is applied between different parts of a conductor, the charges inside it are forced to move, and an electric current appears between the two points in accordance with Ohm’s law. In layman’s terms, a conductor is a material that readily conducts heat and electricity and can be used to transmit them.
An insulator is a material that disallows an electrical current from running through it. The most common insulators are made of glass or rubber.
Describe voltage and how it works.
Voltage is a measure of electrical potential. It can be measured with a voltmeter connected in parallel with the main circuit. Voltage works by "pushing" a current through a circuit; the more voltage, the stronger the current.
What are the two internal voltages most commonly used by PC components?
Define current.
Current is the flow of electricity through a conductor, and is measured in Amperes (A).
What is the 1-10-100 rule?
The 1-10-100 rule is an example of how preventing a problem is much cheaper than fixing it after it has occurred. For example, a company could spend one dollar to prevent a defect, or it could spend ten to rectify it after it occurs. And if the company didn’t fix the problem before handing the product to a customer, the cost would be 100 times more than if they had just prevented it in the first place.
How do you calculate the amount of current running through a circuit?
Current (I)=Power (Watts) / Voltage (Volts)
I=P/V
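As a quick worked example of that formula (the wattage and mains voltage below are assumed figures, not measurements from the lab):

# I = P / V for a hypothetical 400 W load on a 120 V circuit.
power_watts = 400
voltage_volts = 120

current_amps = power_watts / voltage_volts
print(round(current_amps, 2), "A")    # about 3.33 A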
Explain the differences between AC and DC.
The difference between the two is how the electricity flows. DC works by placing an object that creates a magnetic field (i.e. a magnet) beside a strip of wire. This forces the electrons to flow in one direction, since their negative charges are repelled by the negative end of the magnet and attracted by its polar opposite, the positive end. This creates an electric current, and Direct Current (DC) was born; DC was famously championed by Thomas Edison. Nikola Tesla developed AC (Alternating Current). The main difference between the two is that AC travels farther with less power loss and its output can be manipulated, which is accomplished by spinning the magnet instead of leaving it motionless.
Describe the human body's resistance capability.
The human body’s resistance varies widely depending on the conditions under which an electrical shock is endured. For example, if the skin is clean and free of residue, its conductivity is drastically reduced, so much so that a human can survive 20,000 volts of electricity. In tests in which scientists sprinkled water on a hand to simulate sweat, the body was susceptible to a much smaller voltage, roughly 240 V.
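A rough Ohm’s-law sketch of why the wet hand matters so much; the skin resistance values are ballpark assumptions for illustration, not figures from the tests mentioned above.

# I = V / R with assumed dry-skin and wet-skin resistances.
voltage = 240                 # volts, as in the simulated-sweat test
dry_skin_ohms = 100_000       # assumed resistance of clean, dry skin
wet_skin_ohms = 1_000         # assumed resistance once the skin is wet

print(voltage / dry_skin_ohms * 1000, "mA through dry skin")   # ~2.4 mA
print(voltage / wet_skin_ohms * 1000, "mA through wet skin")   # ~240 mA, easily dangerous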
List 5 safety precautions you can take to avoid an electric shock.
Keep your skin dry and clean, the less residue on it, the higher your resistance will be.
Avoid using any electrical device near or in water.
Wear anti-shock gloves when handling potentially dangerous electrical devices or materials.
Try not to rub your feet on material such as carpet etc.
Turn off a device before unplugging it.
List 5 ways to prevent static build-up.
Make sure the area doesn’t get too dry, if it does, it becomes much easier for a static charge to build up.
You can buy anti-static bags etc. However, they are expensive.
Use dryer balls when drying your laundry, they absorb and dissipate the electric charge.
Avoid shuffling your feet, especially on carpet.
Don’t touch an electrode.
List 5 ways to prevent static discharge.
Rap your knuckles on a surface before touching it; this releases the static charge, and it won’t hurt as much as if it had left through your more sensitive fingertips.
Wear leather-soled shoes in order to ground yourself.
Carry a coin and use it to test for a static charge.
Wear a metal thimble; it works best if it is in contact with your skin.
If you regularly shock yourself, you can go one step further and wear a wristband with a wire that is attached to an electrical ground. With this on, the body is incapable of building up a charge.
Describe at least two components found in an ESD-prevention kit.
There are 2 wrist straps, 1 ground cord, 1 static-dissipating mat, 3 bauer bags, and three type II pouches.
What is the job of the PC power supply (besides delivering power)?
It also has a power surge protector to help prevent damage to the computer’s internal parts in the case of a power surge.
What are the specific power supply requirements for a motherboard, memory, CPU, hard drive, CD-ROM drive, and floppy drive? What does the Power_Good (or Power_OK) signal indicate, and why is it an essential part of the diagnostic start-up test?
Outline the differences between a Molex, Berg, and ATA power connector. What are they used for?
Molex: A term for "pin and socket" interconnection, it’s most often used as a disk drive power connector. First produced by the Molex Products Company, its two-piece design eventually became an early computer industry standard. Originally used in home appliances, other companies and industries began integrating it into their machines, including vehicles, minicomputers, and even vending machines. The connector works by using cylindrical spring-metal pins made to fit into similarly shaped sockets, typically with 2, 3, 4, 5, 6, 9, 12, or 15 circuits. A .062-inch pin is capable of carrying a maximum of 5 amperes of current, while a .093-inch pin can carry 8.5 amps.
Berg: The Berg connector is a type of electrical connector used in certain kinds of computer hardware, manufactured by Berg Electronics of St. Louis. There are many different kinds of Berg connectors, the most familiar being the 4-pin Berg, which connects the floppy drive to the power supply. Next is the 2-pin Berg, which connects the front panel lights, turbo switch, and reset button to the computer’s motherboard. Finally, there is another 2-pin Berg that is used as a jumper for motherboard configuration.
ATA: Made mostly for the transfer of data between a computer and a storage device. Its newer Serial ATA form features smaller wires that improve airflow inside the case, faster information transfer, the ability to hot swap, and improved reliability.

What are the two internal voltages most commonly used by PC components?
PC components most commonly run on +5 V and +12 V (newer ATX boards also supply +3.3 V). The total power drawn by a typical personal computer is in the range of 200 W to 500 W.
What is the relationship between electrical power and energy?
Electric power is the rate at which electrical energy is delivered through a circuit; energy is that power accumulated over time. The electricity is then used (transformed) to produce other forms of energy, including heat, light, and sound.
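A small worked example of power versus energy, using made-up numbers (a 60 W load running for 5 hours), since the relationship is just energy = power x time:

# Energy is power accumulated over time.
power_watts = 60
hours_on = 5

energy_wh = power_watts * hours_on
print(energy_wh, "Wh, or", energy_wh / 1000, "kWh")   # 300 Wh = 0.3 kWh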
What are resistance and impedance and how are they different?
Resistance is a measure of how much a given material opposes an electrical current running through it, while impedance is the measure of electrical opposition to an alternating current (AC). Impedance essentially extends the idea of electrical resistance to AC circuits, while resistance applies to both AC and DC.
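A hedged sketch of the difference: a resistor opposes current the same way at any frequency, while a capacitor’s opposition (and therefore the circuit’s impedance) changes with frequency. The component values and frequencies below are made-up illustration numbers, not anything from the questions.

# Impedance of a resistor and capacitor in series at two different frequencies.
import math

R = 100.0        # ohms of plain resistance
C = 10e-6        # a 10 microfarad capacitor

for freq_hz in (50, 1000):
    Xc = 1 / (2 * math.pi * freq_hz * C)    # capacitive reactance, frequency-dependent
    Z = math.sqrt(R ** 2 + Xc ** 2)         # series impedance magnitude
    print(freq_hz, "Hz ->", round(Z, 1), "ohms")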
What are specific power supply requirements for: a motherboard, memory, CPU, hard drive, CD-ROM drive, and floppy drive?
A motherboard needs about 15-30 W of power, a mid-range CPU requires about 60 W, a RAM stick needs 7 W for every 128 MB, an IDE hard drive needs 20 W, a CD-ROM drive needs about 15 W, and a floppy drive needs 10 W.
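To show how those per-component figures add up, here is a rough power-budget sketch; the particular build (two 512 MB RAM sticks, one of each drive, a 300 W supply) is a hypothetical example, not a machine from the lab.

# Sum the rough per-component wattages listed above for an example build.
budget_watts = {
    "motherboard": 25,
    "CPU (mid-range)": 60,
    "RAM, 2 x 512 MB at 7 W per 128 MB": 2 * 4 * 7,
    "IDE hard drive": 20,
    "CD-ROM drive": 15,
    "floppy drive": 10,
}

total = sum(budget_watts.values())
print(total, "W used;", 300 - total, "W of headroom on a 300 W supply")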

Wednesday, October 10, 2007

The Tech Talks

First off, let me say that talking to an uninitiated person who needs tech help is like trying to describe snow to a Hawaiian who has never experienced it. You can describe it as thoroughly as you want, but there is no true way to convey that feeling of it fluttering down onto your finger, the cold seeping into your skin, and that numb pain you feel in your toes when wet snow seeps through your boots. Some people take it to the extreme, spending thousands of dollars on their computers, constantly modifying, updating, upgrading, and personalizing them into their dream machine. Their faces pale from lack of sunlight, and they begin to speak in l337, k1nd@ lik3 7h1$. Put them in the same room as a normal person and tell them to have a conversation, and they probably wouldn’t understand each other. However, that’s taking it to the edge; even trained professionals usually don’t reach that kind of technophile state. There are methods to get the message across. Probably the best way is to show the person what each part is and explain its function in detail. It may take a while, but eventually the person will begin to comprehend what you’re saying.
A personal translator could also work, although I don’t think one has been made for tech talk yet. It’s always important to be able to converse with your equals: if they want to know what’s wrong with their computer, you can explain it to them, but if they don’t know what you’re talking about, then to them it’s all just a bunch of jargon. It’s also important to know what the other person is saying; if you don’t know what’s wrong, and don’t know the technology, a computer repairman could take you, and your wallet, for a ride. It happens all the time with car repair, and a computer is no different, at least when it comes down to a basic understanding of a computer’s inner workings.
USB (Universal Serial Bus): The USB has become the standard way to interface devices to a computer. Designed to allow external peripherals to be attached using a standardized interface socket, it greatly improved on the "plug-and-play" philosophy. By supporting "hot swapping", it allows a multitude of devices to be connected and disconnected without the computer being restarted. It can also supply power to low-consumption devices without an external power supply, and it allows some devices to be used without installing an individual device driver. However, it is not designed for exceedingly complex machines, and as stated previously, it can only provide power to small, power-efficient devices.

SCSI (Small Computer System Interface): A set of standards for physically connecting and transferring data between computers and peripheral devices, SCSI defines everything from commands and protocols to electrical and optical interfaces. Its most common use is for tape and hard drives; however, it can also connect a number of other peripherals, such as scanners, printers, or optical drives (i.e. CD and DVD drives). It defines commands for specific peripheral device types, and the presence of a generic "unknown" type suggests, at least in theory, that it could interface with any SCSI-enabled device; in practice, though, the SCSI standard is mainly aimed at commercial needs. It’s also not very popular for ordinary desktops, since SATA drives are more than adequate and much cheaper. Today SCSI is mainly used in high-powered workstations and servers.

Firewire: An interface developed by Apple Inc., it’s also known as the IEEE 1394 interface; it likewise goes by the name i.Link, which is what Sony decided to dub it. It is now a personal computer serial bus interface industry standard. It offers high-speed communications as well as isochronous real-time data transfer. It has taken over many of the applications where Parallel SCSI once reigned, thanks to its lower implementation cost and a less complex, more adaptable cabling system. Apple originally used it for its hugely popular iPods, but it was replaced there by USB, partly because of space constraints and partly because of USB’s more universal compatibility.

Parallel interface: A type of port normally found on a standard personal computer, it’s used to interface with a number of different external devices or peripherals. It also goes by the name "Centronics port" or "printer port". It is defined by the IEEE 1284 standard, which describes a bi-directional variant of the port. In most cases, the advent of USB has made the parallel port obsolete, while other peripherals use an Ethernet connection in its place. Since 2006, a large number of personal computers have shipped without a parallel port, mostly as a cost-cutting measure. However, it is still used by some laptops when they are operated at a docking station.

Tuesday, September 11, 2007

A history of computers

1939- Hewlett-Packard is founded by David Packard and Bill Hewlett in their Palo Alto, California garage. Their original product is the HP 200A Audio Oscillator, a piece of test equipment. It soon becomes popular with engineers, and Walt Disney uses eight of them to generate sound effects for the 1940 musical movie "Fantasia".
1940- Bell Telephone Laboratories unveils the Complex Number Calculator (CNC). Designed by researcher George Stibitz, it is demonstrated at the 1940 American Mathematical Society conference at Dartmouth College. He wows his audience by performing mathematical calculations on the CNC, which is located in New York. This is the first known instance of someone performing remote computer access. He completes the feat using a Teletype connected over special telephone lines.
1941- The first Bombe is finished. Used to decode Nazi military transmissions during WWII, it was heavily influenced by the work of computer pioneer Alan Turing, among others. Many of them were manufactured and were put to use for the allied forces and dramatically improved intelligence gathering and the processing of information.
1942- The Atanasoff-Berry Computer is completed. Built at Iowa State College by Professor John Vincent Atanasoff and graduate student Cliff Berry, it was never put into regular use. It was in development for three years, from 1939 to 1942.
1943- The U.S. Navy approaches the Massachusetts Institute of Technology (MIT) to build a flight simulator for its pilots, code-named "Whirlwind". Originally an over-sized analogue computer, the first design proved too inaccurate and inflexible. After seeing the unveiling of the ENIAC, the team changes priorities and builds a digital version instead. By the time of its completion in 1951 the Navy is no longer interested, but the project finds a backer in the United States Air Force, and "Whirlwind" becomes the influence for the "SAGE" program.
1944- The first Colossus is assembled and made operational at Bletchley Park. Built to break the Nazi cipher known as "Lorenz", the machines accelerate the code-breaking process from weeks to mere hours. A machine that earns its name, the Colossus was constructed from over 1,500 vacuum tubes and a large series of pulleys that fed a continuous punched-tape roll containing possible code solutions. The project wasn’t made known to the public until the 1970s.
1945- John Von Neumann writes, "First Draft of a Report on the EDVAC" in which he outlined the specifications of a stored-program computer. Electronic storage and up-keeping of programming information and data eliminated the need for the more unwieldy methods of programming, such as punched paper tape — a concept that has shaped mainstream computer development since the 1940’s.
1946- In the month of February, the ENIAC is unveiled to the general public. The new computer, built by John Mauchly and J. Presper Eckert, improves on the speed of its predecessors by 1,000 times.
1948- IBM’s Selective Sequence Electronic Calculator is built. Used to compute scientific data, it ran in public view near the company’s Manhattan headquarters. Before it was decommissioned in 1952, it was used to produce the moon-position tables later used in planning the 1969 Apollo flight to the moon.
1960- The forerunner of the minicomputer, DEC’s PDP-1, has a retail price of $120,000. Only 50 were built; the average PDP-1 came with a cathode ray tube graphic display, required no air conditioning, and needed only one person to operate it. It intrigued early hackers at MIT, who wrote the first computerized video game, "SpaceWar!", for it. The SpaceWar! creators then used the game as a standard demonstration on all 50 computers.
1971- Programmer Ray Tomlinson of Bolt Beranek and Newman sends the first e-mail, over the military network known as "ARPANET". Tomlinson is the one who decides to use the now universally recognized @ symbol to address his mail. Later, when asked what the first e-mail contained, Ray answers "something like QWERTYUIOP".
1972- Pong is released. In 1966, Ralph Baer had designed a ping-pong game for his Odyssey gaming console. After Nolan Bushnell played it at a Magnavox product show in Burlingame, California, he hired a young engineer named Al Alcorn to design a car-driving game. However, when it became apparent that this was too difficult an undertaking for the time, he had Alcorn design a version of ping-pong instead. The game was tested in bars and pubs in Grass Valley and Sunnyvale, California, where it proved immensely popular. Pong went on to revolutionize the arcade industry and launch the modern video game era.
1977- Atari releases its Video Computer System game console. Later renamed the Atari 2600, the VCS was the first truly successful video game system, selling in excess of twenty million units throughout the 1980s. The VCS used the 8-bit MOS 6507 microprocessor and was made to be connected to a normal TV set. By the time the last of Atari’s 8-bit game consoles were made in 1990, more than 900 video game titles had been released. It was later toppled by the "Nintendo Entertainment System", also known as the "NES".
1989- Maxis releases SimCity, a video game that will spawn numerous imitators. Will Wright, the co-founder of Maxis, based the game on his childhood interest in assembling plastic models of ships and airplanes. He eventually started a company with Jeff Braun and designed a program that let the player build his or her own "Sim City". The game was revolutionary for a number of reasons, one of which was the player’s ability to play "god" with their creations. Later additions included the ability to unleash storms, earthquakes, and even quasi-Godzilla attacks upon the Sim City’s defenceless inhabitants.
1993- Now-legendary game producer id Software releases "Doom". An immersive first-person shooter, it received a number of awards, some more dubious than others (these include best video game weapon ever, the chainsaw, and the cleverest use of an acronym for a weapon, "the BFG"). Doom players were also among the first to modify the game’s levels and appearance. Doom would spawn several sequels and a (bad) 2005 film, which only served to lower everyone’s expectations of game-to-movie adaptations.
1988- Pixar’s "Tin Toy" becomes the first computer-animated film to win an Academy Award, taking the Oscar for Best Animated Short Film. "Tin Toy" follows a wind-up toy’s first encounter with a boisterous baby. In order to illustrate the baby’s facial expressions, the animators had to create more than forty different facial "muscles".

Monday, September 10, 2007

Tech lab entry #1

Date: Monday, September 10th, 2007

When I originally joined this course, it was to learn how to dismantle, repair, and generally mess around with computers. Now that the class has actually started, it looks like it's going to be much more extensive than I originally thought, what with talk about installing Linux on Xboxes, LAN parties, and nagging the principal to install Windows XP on the computers (although it would only be good for the planning classes). Hopefully my game know-how will prove useful, thanks to the fact that I know dozens of online and downloadable games that would be perfect to run on these unwieldy and slow clunkers. I just hope people don't notice my general lack of knowledge about which pieces go where, or even what they're named. What I do know, I know well; unfortunately, that's not as expansive as I would like. Hopefully this class will remedy that.