Neve Input/Output Module – also known as a channel strip
Designed by Rupert Neve (founder of Neve Electronics, est. 1961)
Mic Pre-Amp
Bus
Phantom Power – +48 Volts
Auxiliaries 1 and 2
Auxiliaries 3 and 4
Auxiliaries 5 and 6 (Headphone Mix)
Parametric Equalization (High Range, 2 kHz–20 kHz) – with Q (margin)
Parametric Equalization (Mid-High Range, 400 Hz–2 kHz) – with Q (margin), a.k.a. trajectory
Parametric Equalization (Low-Mid Range, 100 Hz–400 Hz) – with Q (margin), a.k.a. trajectory
Parametric Equalization (Low Range, 12 Hz–100 Hz) – with Q (margin), a.k.a. trajectory (see the filter sketch after this list)
Pan Pot – Left and Right
Channel Pot (Amplification) - Source
Switch
Monitor Fader – (monitor interpretation of the energy source, a.k.a. satellite)
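Each of the parametric EQ bands above boosts or cuts around a centre frequency with an adjustable Q (the margin, or bandwidth). Below is a minimal sketch of one such band as a peaking biquad filter, built from the widely published RBJ audio EQ cookbook formulas; the 48 kHz sample rate, 1 kHz centre frequency, +6 dB boost, and Q of 1.4 are illustrative assumptions, not values taken from any Neve module.

```python
import math

def peaking_eq_coeffs(fs, f0, gain_db, q):
    """Biquad coefficients for one parametric EQ band (RBJ cookbook peaking filter)."""
    a = 10 ** (gain_db / 40.0)            # amplitude from the dB boost/cut
    w0 = 2 * math.pi * f0 / fs            # centre frequency in radians per sample
    alpha = math.sin(w0) / (2 * q)        # bandwidth term set by Q
    b = [1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a]
    a_coef = [1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a]
    # normalise so the first feedback coefficient is 1
    return [x / a_coef[0] for x in b], [x / a_coef[0] for x in a_coef]

def biquad(samples, b, a):
    """Run the filter sample by sample (direct form I)."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

# Example (assumed values): +6 dB boost at 1 kHz, Q of 1.4, 48 kHz sample rate.
b, a = peaking_eq_coeffs(fs=48000, f0=1000.0, gain_db=6.0, q=1.4)
boosted = biquad([0.0, 0.5, 0.25, -0.5], b, a)
```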
In the movie E.T., Spielberg shows how a frequency can be transmitted from Earth to other parts of the known universe (many galaxies). A received frequency is picked up by an antenna, a monitoring device. In one of Spielberg's earlier movies, Close Encounters of the Third Kind, music is the common language of communication between alien life and human life. Music in its most pristine form is a synthesis of sound (specific, quantified frequencies, otherwise known as algorithms) transmitted from source to monitor (receptor/antenna). This is transference.
Microphones
This transference is done through microphones. A microphone picks up a signal, colors the sound, and amplifies the signal to a recording device or a signal processing unit (commonly referred to as mixers and effects units). Common microphones include ribbon, dynamic, and condenser (the last requiring a 48 volt boost known as phantom power).
Within each microphone there are several properties:
Polar Patterns – Cardioid, Hypercardioid, Omnidirectional, Bidirectional, and Unidirectional
Spider – the size of the diaphragm inside the microphone, otherwise known as the input capacity of the microphone
Pickup – the sensitivity of the microphone; its dynamic range
The Connector Used – XLR, quarter-inch, eighth-inch, RCA
Signal Processing Units and Mixers
An effects unit is used in conjunction with a mixer to color the sound of the input before the recording (the output). An effects unit is synonymous with a signal processing unit, as is a mixer. A mixer takes in signals from the microphone panel and processes all effects units with respect to the microphone signal path. The final mix-down is done into two channels, left and right, and recorded onto 2-track. Feeds from the mixing board almost always carry more than two tracks, and this entails recording the mix-down onto a larger recording device: 8-track (Tascam DA-88 or Alesis ADAT), 24-track (studio recordings – Sony MXP, Westar, Neotek, SSL, Neve), 192-track (for movies – Neve Capricorn), and so on.
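As a rough illustration of the final mix-down described above, here is a sketch that sums several mono channels into a left/right pair using per-channel fader gain and a constant-power pan pot. The pan law, gain values, and function names are assumptions for the example, not a description of any particular console.

```python
import math

def mixdown(tracks, pans, fader_db):
    """Sum mono channel tracks into a stereo (left, right) pair.

    tracks:   list of equal-length sample lists, one per channel
    pans:     -1.0 (hard left) .. +1.0 (hard right), like a pan pot
    fader_db: channel fader settings in dB
    """
    n = len(tracks[0])
    left, right = [0.0] * n, [0.0] * n
    for track, pan, g_db in zip(tracks, pans, fader_db):
        g = 10 ** (g_db / 20.0)                  # dB fader to linear gain
        theta = (pan + 1.0) * math.pi / 4.0      # constant-power pan law
        gl, gr = g * math.cos(theta), g * math.sin(theta)
        for i, x in enumerate(track):
            left[i] += gl * x
            right[i] += gr * x
    return left, right

# Example: three channels panned left, centre, and right at unity gain.
L, R = mixdown([[0.1, 0.2], [0.3, 0.1], [0.0, 0.4]],
               pans=[-1.0, 0.0, 1.0],
               fader_db=[0.0, 0.0, 0.0])
```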
The signal path from the microphone to the recording device is as follows:
Microphone output – Microphone Panel – Channel – Phantom Power (if it is a condenser microphone) – Microphone Pre-Amplification – Bus (to assigned channel) – Effects Unit (Reverb) – Outboard Gear (Compressor) – Equalization – Master Channels Left and Right – 2-Track Studer and 24-Track Master.
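The "Outboard Gear (Compressor)" stage in this path reduces gain once the signal rises above a threshold. A minimal feed-forward compressor sketch follows; the threshold, ratio, and attack/release times are assumed example settings, not values from the text.

```python
import math

def compress(samples, fs=48000, threshold_db=-18.0, ratio=4.0,
             attack_s=0.010, release_s=0.100):
    """Tiny feed-forward compressor: envelope follower plus a static gain curve."""
    att = math.exp(-1.0 / (attack_s * fs))    # smoothing while the level rises
    rel = math.exp(-1.0 / (release_s * fs))   # smoothing while the level falls
    env = 0.0
    out = []
    for x in samples:
        level = abs(x)
        coeff = att if level > env else rel
        env = coeff * env + (1.0 - coeff) * level          # follow the signal level
        level_db = 20.0 * math.log10(max(env, 1e-9))
        over = level_db - threshold_db
        gain_db = 0.0 if over <= 0.0 else -over * (1.0 - 1.0 / ratio)  # gain reduction
        out.append(x * 10 ** (gain_db / 20.0))
    return out

squashed = compress([0.9, 0.8, 0.05, 0.7, 0.02])
```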
Each channel line uses low capacitance, enhancing the signal on recording (depth perception, reverb, spatial time, tube amplifiers). Each channel is processed left and right and mastered onto one channel (the amplifier), otherwise known as mono. An old Neve strip comes to mind when I think of great signal processors (otherwise known as amplifiers – variable gain amplifiers, V.G.A.).
Monitors
There are three types of monitors: near-fields, mid-fields, and far-fields.
1) Near-fields – The smallest drivers of the three. They usually have only one crossover point, separating the input signal above to the tweeter and below to the woofer. They are almost always used in the control rooms of studios to gauge high-frequency response; as the saying goes, if your mix-down sounds good on a near-field, it will sound good on anything.
2) Mid-fields – The mid-field monitor usually has only one crossover point, similar to a near-field; however, the woofer is usually twice as large as a near-field's (8 inches compared to 4 inches). They are used mainly in the studio control room to gauge the low end of the mix-down. They are sometimes used in live sound in very small venues.
3) Far-fields – The far-field is almost always used in live venues, as it can play at the highest amplitude of the three. It usually has two crossover points: a tweeter (high frequencies), a woofer, and a subwoofer (a woofer that is 16 inches or more). A sketch of both crossover arrangements follows this list.
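The crossover points mentioned above are filters that split the signal between drivers. Below is a sketch of a one-crossover (near-field/mid-field style) and a two-crossover (far-field style) split, assuming numpy and scipy are available; the crossover frequencies and filter order are illustrative choices, not figures from the text.

```python
import numpy as np
from scipy.signal import butter, lfilter

def one_crossover(signal, fs, crossover_hz=2500.0, order=4):
    """One crossover point: lows to the woofer, highs to the tweeter."""
    b_lo, a_lo = butter(order, crossover_hz, btype="low", fs=fs)
    b_hi, a_hi = butter(order, crossover_hz, btype="high", fs=fs)
    return lfilter(b_lo, a_lo, signal), lfilter(b_hi, a_hi, signal)

def two_crossovers(signal, fs, low_hz=250.0, high_hz=2500.0, order=4):
    """Two crossover points: subwoofer, woofer/mid, and tweeter bands."""
    b, a = butter(order, low_hz, btype="low", fs=fs)
    sub = lfilter(b, a, signal)
    b, a = butter(order, [low_hz, high_hz], btype="band", fs=fs)
    mid = lfilter(b, a, signal)
    b, a = butter(order, high_hz, btype="high", fs=fs)
    high = lfilter(b, a, signal)
    return sub, mid, high

# Example: split one second of noise the way a near-field (one crossover)
# and a far-field (two crossovers) would.
fs = 48000
x = np.random.randn(fs)
woofer, tweeter = one_crossover(x, fs)
sub, mid, high = two_crossovers(x, fs)
```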
Solar Flares
This is an electrical storm expected in and around 2011. Every satellite around the globe collects and transmits energy from the sun, acting as a monitor into a source. There is much speculation over whether solar flares will obliterate every satellite feed in the years 2011–2012. This would result in every communication device carrying a ground hub as backup to every current ground hub backup, essentially making our communications worldwide UHF/VHF (or shortwave) rather than optical (in other words, back to analog using copper capacitance) within every ground feed. Currently the largest network in place is owned by Abbey Road and Disney Studios.
Interface Design
Channel to Channel – Web conferencing that is software based, also known as simultaneous broadcasting.
Search Engine to Search Engine – An Input/Output module feeding the Master Input/Output module is the best analogy. In other words, you mix 24 channels down to left and right.
Intranet to Intranet – One company accessing the files of another company, and vice versa (considered to be a Local Area Network).
Intranet to Internet – This is when a Local Area Network is tied to a Wide Area Network (a.k.a. the internet).
Internet to Internet – One Wide Area Network tied to another Wide Area Network. Each Wide Area Network incorporates unlimited channel splitters to accommodate growth.
It is at the point of there being more than one internet that information is multiplexed by a governing body to be indiscriminate of absolute truth and to allow for unbiased communication worldwide. Accountability is also promoted in this context. For example, one internet might offer free long distance to its customers while another might promote free cable service. Likewise, an internet that contains elements of misinformation would be less desirable than one that held its users to higher standards.
Transponders
In 1985 Citroën (a French auto manufacturer) carried a software base in the steering wheel, with a G.P.S. tracking device inserted for the purpose of future infrastructure (either by ground or by air). It implemented transponders for the guidance and tracking of all motorized vehicles that carried this technology. This would mean that every vehicle with a G.P.S. tracking device could be monitored and tracked by transponders over a designated area of land.
The issue we see today is how far apart they must be placed: separated by a minimum distance to minimize any feedback (distortion of signal), and by a maximum distance to prevent any loss of signal. Essentially this could take us to the point of every vehicle traveling above ground level, as space in the future would be imperative.
Transformers, Algorithms, and Synthesis
The connecting of the dots, so to speak: taking one algorithm and connecting it to another algorithm by using the practice of reverse polarization (silicon crystal tolerance), also known as phase shift (or normalization – the premise that opposites attract) in the digital domain – defined ions and their relationships with each other. This is synonymous with a child playing with LEGO or building blocks. By creating connections from one algorithm to the next a picture takes shape, and this is the basis of synthesis. Digital editing software includes:
Pro Tools by Digidesign
Sound Forge by Sonic Foundry (later Sony)
Cubase by Steinberg
Sonar by Cakewalk
Sound Designer by Digidesign
Incorporating guard-bands, algorithms, envelopes, and filtering in converting sound into synthesis allows for an exact, precise measurement of temporal and spatial time. This, in its purest form, is absolute zero (the gist). It is also known as a chronological event (legend), or chronicles – interpretations of events (documentation) / interpolation – Jedi.
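One of the envelopes mentioned above, applied to a raw oscillator, can be sketched as a simple attack-decay-sustain-release (ADSR) shape. The timing values below are assumptions for illustration, not settings from any particular synthesizer.

```python
import numpy as np

def adsr(n_samples, fs, attack=0.01, decay=0.1, sustain=0.7, release=0.2):
    """Piecewise-linear ADSR envelope: attack, decay, sustain level, release."""
    a, d, r = int(attack * fs), int(decay * fs), int(release * fs)
    s = max(n_samples - a - d - r, 0)
    env = np.concatenate([
        np.linspace(0.0, 1.0, a, endpoint=False),      # attack ramp up
        np.linspace(1.0, sustain, d, endpoint=False),  # decay to the sustain level
        np.full(s, sustain),                           # hold
        np.linspace(sustain, 0.0, r),                  # release back to silence
    ])
    return env[:n_samples]

# Shape one second of a raw 440 Hz oscillator with the envelope (assumed settings).
fs = 48000
tone = np.sin(2 * np.pi * 440.0 * np.arange(fs) / fs)
note = tone * adsr(len(tone), fs)
```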
Examples of synthesis (specific designs):
Moog
Oberheim
Hammond
Sawtooth Waves
Triangle Waves
Sine Waves
Cosine Waves (a sine wave shifted by a quarter cycle, 90 degrees)
Square Waves (see the waveform sketch below)
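A minimal sketch of these classic waveforms, generated naively (not band-limited) with numpy; the 440 Hz frequency and 48 kHz sample rate are assumed example values.

```python
import numpy as np

def waveforms(freq=440.0, fs=48000, seconds=1.0):
    """Naive (non-band-limited) versions of the classic synthesis waveforms."""
    t = np.arange(int(fs * seconds)) / fs
    phase = freq * t                       # cycles elapsed at each sample
    frac = phase % 1.0                     # position within the current cycle
    return {
        "sine":     np.sin(2 * np.pi * phase),
        "cosine":   np.cos(2 * np.pi * phase),          # sine shifted a quarter cycle
        "square":   np.sign(np.sin(2 * np.pi * phase)),
        "sawtooth": 2.0 * frac - 1.0,
        "triangle": 2.0 * np.abs(2.0 * frac - 1.0) - 1.0,
    }

waves = waveforms()    # e.g. waves["square"] is one second of a 440 Hz square wave
```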
Each form of synthesis has its own set of guard-bands, algorithms, envelopes, and filtering.
Each transformer has its own circuitry and is interfaced systematically from transformer to transformer (and in the digital domain incorporates a transference using crystal tolerance – as precise as can be – to preserve data transfer through each resistor component, using Pulse Code Modulation in order to transfer digital information from the source to its monitors). In the old days (back in the 60s) the highest grade of tolerance marking used on resistors was gold, and it is still used in electronic designs for vintage productions.
Just an analogy:
Now you know everything. Will you wake up tomorrow?
The truth is that no one will ever know everything. The Creator of the Universe has all the information in the World, and all the Money in the World. She told me so. The Creator of the Universe uses 384-bit technology and she doesn't want everyone to know everything she knows.
20-bit isn’t that bad.
Tuesday, April 28, 2009
The Chief Climate Alarmist by Gråulf
I woke up to 4 inches of snow on the ground this morning, so it was fitting that I opened the morning newspaper to an article praising James Hansen. Hansen is the chief climate scientist at the NASA Goddard Institute for Space Studies (GISS) and is the man who originally raised the alarm on global warming in 1988 in an appearance before Congress. He is also the keeper of the most often cited climate data, and he is the author of the infamous “Hockey Stick” graph Al Gore used in his movie. Then, back in January, Hansen wrote an open letter to Obama, warning him that human activity is causing greenhouse gas levels to rise so rapidly that his models suggest a runaway greenhouse effect, ultimately resulting in the loss of all life on the planet, unless Obama can stop the use of coal during the next 8 years.
There are problems with Hansen’s hockey stick graph. To demonstrate his assertion that current global temperatures are hotter and have risen more suddenly than temperature fluctuations in the past, he minimizes the Medieval Warm Period and the subsequent Little Ice Age. No one knows for sure where Hansen got his data, since he has refused to reveal his methodology, but speculation is that he used tree ring data. That is problematic, since the size of tree rings can vary due to many factors, such as precipitation, temperature, and forest density, and therefore needs to be compared to other factors to be of any use. The very fact that Hansen has not revealed his methodology should invalidate his hockey stick graph, because that makes it impossible to duplicate his research. Then a team of Canadian mathematicians discovered that Hansen made mistakes when he entered his data into his climate model, so that the outcome would always result in a “hockey stick” no matter what data was entered. Despite the problems with the hockey stick, the Intergovernmental Panel on Climate Change (IPCC) continues to use it.
One of the key tenets of the global warming alarmists is that nine of the ten warmest years on record have occurred since 1995, and the warmest year on record was 1998. A scientist by the name of Stephen McIntyre recently notified NASA that there were errors in their temperature records, and NASA quietly corrected their data to show that actually four of the hottest years on record occurred in the 1930s, and the hottest year on record was 1934. No one knows if the temperature records were deliberately tampered with, but the correction to the data has not changed the claim by global warming alarmists that this decade is the warmest on record. An interesting aside to the discussion about temperature records is some research done by retired TV meteorologist Anthony Watts. The U.S. Weather Service specified in 1889 that the shelters for weather instruments had to be whitewashed. Then, in the late 1970s, the Weather Service changed that specification to white latex paint. Mr. Watts built three instrument shelters: one was whitewashed, one painted with white latex paint, and the third was left unpainted. He measured temperatures for several months, but typical among his results was one day in August when he found that the instruments in the unpainted shelter registered a maximum daytime temperature of 98.47 degrees, the instruments in the latex-painted shelter showed 97.74 degrees, and the whitewashed shelter measured 96.94 degrees. In short, the change from whitewash to latex paint caused almost a one-degree increase in the measured average temperature of the U.S.
James Hansen was one of the main authors of the IPCC’s report on global warming, but he disagrees with the report’s estimate that the sea level rise will be about 0.7 meter by 2100. This, Hansen thinks, was a serious error. He argues there is a major risk that sea levels will rise by several meters this century, and that sea levels have been rising rapidly during the last 20 years. That should be easily checked against recorded data, but it is not that simple. Most oceanographers believe that sea levels are rising at about seven inches per century, and will continue to do so. However, tide gauges in the US show considerable variation because some land areas are rising and some are sinking. For example, over the past 100 years, the rate of sea level rise varies from an increase of about 0.36 inches (9.1 mm) per year along the Louisiana coast (due to land sinking), to a drop of a few inches per decade in parts of Alaska, where the land is rising. Further adding to the confusion is the fact that sea levels are not level. There are steady currents in the ocean, driven by winds and atmospheric heating and cooling, which give rise to differences in sea level around the world. For example, the Atlantic Ocean north of the Gulf Stream is about 1 meter lower than further south, and the Atlantic as a whole is about 40 cm lower than the Pacific. There is even a sea level difference of about 20 cm across the Panama Canal.
What I really dislike about James Hansen is his relentless and vicious attacks on anyone who disagrees with his climate models. Hansen has helped build global warming into a trillion-dollar industry, and he is quite willing to do whatever it takes to destroy those he considers enemies. That includes personal attacks, pressuring publishers to reject research papers that disagree with his conclusions, and even getting people fired when they disagree with him. One of those who felt his wrath was Syun-Ichi Akasofu, of the International Arctic Research Center at the University of Alaska. Akasofu points out that the connection between CO2 and global warming has not been proven, and suggests that a rise in global temperature of one degree Fahrenheit in the past 100 years is not alarming, and that most of that rise in temperature is due to the Earth recovering from the “Little Ice Age.” Other scientists vilified by Hansen include Willie Soon and Sallie Baliunas from the Harvard-Smithsonian Center for Astrophysics. When their paper, proving that the Medieval Warm Period and the Little Ice Age were climatic anomalies felt worldwide, was published, Hansen got the editor of Climate Research fired. Finally, Hansen’s relentless attacks on William Gray, Professor Emeritus of Atmospheric Science at Colorado State University, lost the professor his grant for predicting the number and severity of hurricanes in the Atlantic. William Gray had the temerity to point out that there were more named hurricanes during the first half of the 20th century than in the second half.
Gråulf.
Monday, April 27, 2009
Technology (Part Four) - by Moses
C.R.T.C. – The Canadian Radio-television and Telecommunications Commission
Every phone call is taped, every single call. Monitoring is usually conducted by at least 7 countries on every single phone call. Believe it or not, actions are not made unilaterally. The C.R.T.C. and the F.C.C. (Federal Communications Commission) are around trying to unify every American, pronounced on our radios, televisions, and telephones.
On long distance phone calls, arrangements are made as to who is able to monitor the call. For example, on an ‘Ole’ phone card, the line may have been sublet to Spanish, Portuguese, Mexican, Brazilian, Argentine, Chilean, and Angolan telecommunication specialists for monitoring purposes. These specialists interpret and transpose the conversation, each body independent of the others. Likewise, a Bell calling card may be sublet to Canada, the U.S.A., Ethiopia, Sweden, Israel, Saudi Arabia, and China. Herein lies the problem with this issue. Should I be unaware of how I am being interpreted on a phone call (based on telecommunication analysis), who is playing G-d, judge and jury, with respect to my Soul, Culture, Freedoms, Causes and Effects, and Consequences regarding having a chat that begins ‘How are you’?
The telephone call is the line of communication. What about the phone itself, the satellite, the Relay Station….? (This is why I use a secure Digital Workstation that has no ties into the internet or any other computer for that matter).
Should my phone have exclusive rights to Canada, and the call is monitored by Canada, Sweden, Ethiopia, Iceland, Israel, and Saudi Arabia, does the Canadian Government really have ultimate say as to how the call was perceived (interpreted), after the fact? Reference - ‘Enemy of the State’ – ‘Who Monitors the Monitors?’
Abstract Philosophy
People – Tribe – Nation – Satellite (monitor) – communication – district – city – country – continent – planet (earth)
Principles
1. Each house is a nation, hence ‘nuclear families’, landlords. Nations are made up of people, some of whom are referred to as G-ds / role models / idols.
2. To monitor each telephone / computer (any communication device, for that matter), there is an area code, followed by an exchange, followed by a 4-digit identification code. In ground telecommunications this holds true. In cellular telecommunications there is also a chip that stores the location (the frequency) and transmits the aforementioned data to a relay station. In satellite telecommunications the technology is more sophisticated, and a person can be tracked simply by answering their phone, as there is a G.P.S. (Global Positioning System) tracking device in the phone itself. Likewise, an independent workstation (commonly referred to as consulting) is a search engine upon copyright.
3. To access these stations through a Local Area Network or Wide Area Network, software applications are used - commonly referred to as the intranet (in the Private Sector), and the internet (in the Public sector).
4. Amplitude Modulation (A.M.) – white noise (equal energy per frequency), pink noise (equal energy per octave). Frequency Modulation (F.M.) – altering time, commonly referred to as bandwidth. (See the noise sketch after this list.)
5. Frequencies that are defined (a specific series of events over an exact moment) – motion/in-motion, compression/rarefaction, also measured in hertz (named after the scientist Heinrich Hertz). Where do they go? Within each universe there are many galaxies. We are grounded to prevent dementia (exploration outside the known Universe); this is our depth perception, also known as leagues (a naval term), co-ordinates (X, Y, Z) – space and time travel.
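The white-noise and pink-noise definitions in item 4 can be illustrated directly: white noise has equal average power per hertz, while pink noise has equal average power per octave, so its power falls off as 1/f. Here is a sketch of both, shaping a white spectrum for the pink case; the lengths and normalisation are arbitrary example choices.

```python
import numpy as np

def white_noise(n):
    """White noise: equal average power per hertz of bandwidth."""
    return np.random.randn(n)

def pink_noise(n):
    """Pink noise: equal average power per octave (power falls off as 1/f).

    Made by shaping a white spectrum so its amplitude scales as 1/sqrt(f).
    """
    spectrum = np.fft.rfft(np.random.randn(n))
    freqs = np.fft.rfftfreq(n)
    scale = np.ones_like(freqs)
    scale[1:] = 1.0 / np.sqrt(freqs[1:])   # leave the DC bin untouched
    pink = np.fft.irfft(spectrum * scale, n)
    return pink / np.max(np.abs(pink))     # normalise to +/- 1

w, p = white_noise(48000), pink_noise(48000)
```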
Capacitors and Current Flow (A/C-D/C) and Ohm’s Law
A capacitor stores energy. Capacitance is measured in farads, the stored energy in joules, and at the release point the discharge can drive an amplifier. For example, a light bulb takes in 60 watts (or 100 watts, for that matter); the current drawn from the light socket varies depending on the design. A/C is alternating current (championed by Tesla and Westinghouse) and runs at 60 Hz and 120 V under North American standards; European mains power is A/C at 50 Hz and 230 V. D/C is direct current (championed by Edison) and has no frequency at all. Once the current leaves the capacitor it is transferred through resistors for safety purposes, as the raw feed is dangerous to take in. Resistors carry tolerance ratings to prevent accidental electrocution. i.e. – Shocker (a movie depicting a man who is so power hungry that he feeds off other people’s energy, rendering them powerless). Tolerance bands are marked in silver and gold; copper is prized as a conductor for its low resistance, and crystal (quartz) for its precise timing. Tolerance is also referred to as margin or Q (trajectory). A capacitor is capable of distributing the current into groupings in the billions and trillions, which is why sample-and-hold circuitry is used in high-performance professional hardware, so as to prevent a computer from crashing. These chips are made of silicon, and the farad, the unit of capacitance, is named after Michael Faraday.
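As a small worked example of a capacitor storing energy: the stored energy is E = ½ C V², in joules. The capacitance and voltage below are assumed example values.

```python
def capacitor_energy_joules(capacitance_farads, voltage_volts):
    """Energy stored in a charged capacitor: E = 0.5 * C * V**2, in joules (not watts)."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A 470 microfarad capacitor charged to 12 V stores roughly 0.034 joules.
print(capacitor_energy_joules(470e-6, 12.0))
```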
In Ohm’s law, Voltage = Current × Resistance (V = I × R). Voltage is the electrical pressure driving the current, resistance is impedance (also described here as tolerance, and measured in ohms), and power is measured in watts. Watt’s law (the power law) is Power = Voltage × Current, so Current = Power / Voltage. Distance, separately, is speed × time.
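A short worked example of Ohm’s law and Watt’s law using the 60-watt bulb mentioned earlier; the 120 V supply is the assumed North American figure.

```python
def ohms_law_voltage(current_amps, resistance_ohms):
    """Ohm's law: V = I * R."""
    return current_amps * resistance_ohms

def watts_law_power(voltage_volts, current_amps):
    """Watt's law: P = V * I (equivalently P = V**2 / R or I**2 * R)."""
    return voltage_volts * current_amps

# A 60 W bulb on a 120 V supply draws 0.5 A through roughly 240 ohms.
current = 60.0 / 120.0                 # I = P / V
resistance = 120.0 / current           # R = V / I
print(current, resistance, watts_law_power(120.0, current))
```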
Saturday, April 25, 2009
The Global Warming Hoax by Gråulf
I was planning to work on my motorcycle today, but it is too cold to work out in the garage, so I am inside, with the furnace on, writing this.
I didn’t pay much attention to the global warming talk until a couple of years ago. There have always been gloom and doom predictions. In the old days people thought natural catastrophes were God’s punishment for their sins. Then pollution became the popular cause. I vividly remember listening to a scientist from the University of Colorado pontificating that we had reached a critical tipping point, and that human life on earth would be snuffed out within 10 years. Back in the seventies the talk was all about the impending ice age. At the same time, some people were obsessed by the possibility of a nuclear winter, and if that didn’t wipe us out, overpopulation would. Then the imminent danger became global warming. It seemed sort of silly to me, because Colorado was always hot in the summer, and people wear big hats because the summer sun can fry your brain without one. A degree one way or the other would not make any difference.
When Al Gore’s movie came out I began to take global warming seriously, because I realized this was not just an idiot fringe belief. Friends of mine came away from “An Inconvenient Truth” frightened by what we are doing to the climate, and terrified by Gore’s dire predictions of the consequences. Normally I would not pay money to go and see anything by Al Gore, but I finally did see the movie, and it was about what I expected. The high point was when Al Gore hoisted his fat ass up to the ceiling to point out how much hotter it is now than it has been in the past million years. I am an amateur historian, and I knew he was lying, because it was warmer during the Viking Age, when Viking settlers were able to grow grain on Greenland, and you can’t grow grain there today. That lie made me wonder what else Gore lied about, and got me started on what was to become a two-year research project.
Al Gore claims there is worldwide consensus among climatologists that today’s global warming is caused by our CO2 pollution, and that the debate is over. To prove his point he shows belching factory chimneys, followed by a litany of dried-up farmland, forest fires, hurricanes, floods, melting glaciers, and drowning polar bears. Gore is a little vague on exactly how the belching factory chimneys are causing this calamity. When I began reading peer-reviewed articles on climate change I assumed I would find scientific proof that CO2 is responsible for the present global warming, but all I found were predictions based on computer models, and the fact that both CO2 and the earth’s temperature have risen during the past one hundred years. It is not surprising that global temperatures are rising, since we are coming out of “the Little Ice Age”, but there is no proof whatsoever that the rise in CO2 in the atmosphere is responsible for more than a small part of the rise in temperatures. However, there are indications to the contrary. During the 1930s and 1940s it was as warm as it is today, and then the temperature cooled for twenty years, while the CO2 levels continued to rise. That is happening again today. There has been no increase in global warming for the past ten years, and global temperatures have fallen by almost one degree during the last two years, and again, CO2 levels continue to increase. The claim that ice cores prove CO2 is responsible for global warming is also false, because the ice cores confirm definitively that the temperature rose well before there was a rise in CO2. What I find significant is that the computer models did not predict, and cannot explain, the drop in global temperature that is taking place now. Al Gore’s claim of consensus among climatologists is far-fetched as well. Lots of climatologists do not agree that CO2 is responsible for significant global warming, and a lot of scientists are changing sides due to the cooling taking place now. Russian scientists have never agreed that global warming is caused by CO2, and now a majority of Japanese and New Zealand climatologists are speaking out against that theory as well.
CO2 constitutes only 0.035% of the earth’s atmosphere, and man-made CO2 is 3.225% of that total. To reduce modern climate change to one variable, CO2, or a small proportion of one variable - human-induced CO2 - is not science. It is also contrary to how CO2 behaves in the atmosphere. CO2 is not a good conductor of heat, as it only absorbs heat in a few narrow bands of the infrared spectrum. Global warming advocates explain their extraordinary claim of CO2’s disproportionate role in global warming by inventing a feedback loop that has CO2 reradiate heat back at the earth, thereby causing further warming. That is contrary to everything we know of heat transfer. Heat radiates towards anything that is colder, never towards anything that is warmer. Even if the claimed CO2 feedback loop were possible, only 0.28% of the so-called greenhouse effect is due to human activity.
As I was reading hundreds of peer-reviewed papers on climate change I tried to engage people in conversation about global warming. Most don’t know anything about CO2, or how the climate system works, but they know that the debate is over, and that man-made global warming is responsible for all the cute polar bears drowning. Global warming is like religion. It has to be taken on faith rather than knowledge. One professor at the University of Colorado was outraged that I questioned the climate change computer models, and told me that I should read some of the books in the library at the National Center for Atmospheric Research. That is like being told to go to the Vatican to ask if Christianity is real. The institute is the national center for computer modelers, and most of its funding is based on keeping the faith.
CO2 is an invisible, odorless, and harmless trace gas in the atmosphere that plants need to survive, and now the Obama administration is about to classify it as a pollutant so they can tax it. Obama’s tax-and-trade scheme is based on a theory that is unproven, and beyond crazy, and will drive what little manufacturing we have left out of the country. It will also enable government to regulate every part of our lives.
Gråulf.
Saturday, April 18, 2009
Pissed off Americans by Gråulf
A while ago I heard that police departments around the country had lists of things to help them identify potentially dangerous people. One of the things they look for is people with NRA stickers and/or American flags on their cars. The stories seemed too silly to be true. Then, on April 7, 2009, the Department of Homeland Security declassified a report on the “Top Threats” to American security titled “Rightwing Extremism”. Imagine my surprise at learning that I am a Right Wing Extremist because I own guns and I am a lifetime member of the NRA; I wear an American flag in my lapel, and fly an American flag on national holidays; I don’t believe that Islam is a “Religion of Peace”; I believe illegal immigration should be stopped, and those who are here illegally should be sent back to wherever they came from; I favor the death penalty for brutal killers; and I think that Nancy Pelosi is dumber than a box of rocks.
What really angered a lot of people is that the security report identified returning American soldiers as being especially dangerous, because they have been trained to violence, and some of them are unstable individuals. This evaluation is based solely on a study of returning soldiers that concludes they are more violent, and more prone to violence and suicide, than the general population. The study is flawed and fraudulent, since it measures returning soldiers against the general population. When returning soldiers are measured against people of the same age group there is no difference in the rate of violent behavior.
Many of those who voted for Obama should have paid attention when he made disparaging remarks about “rednecks” who cling to their bibles and their guns when times are hard. He is coming after them now, and according to recent statements made by Pelosi, he is soon coming after their guns.
Democrats, and the major news services, discount the April 15th “Tea Party” demonstrations as racist. What they ignore is that never before have there been major demonstrations against a president during his first 100 days in office. You don’t have to be racist to resent a president who spends your tax money like a drunken sailor on shore leave; you are not a radical for not appreciating being lied to about the pork in the stimulus bill and the budget; you are not an extremist for resenting Obama’s “America Stinks” tour of Europe; and you are not a radical for being offended at the president of the USA almost kissing the Saudi king’s brown ass. You are just a pissed off American. Wait till you see the 4th of July demonstrations.
Gråulf.
Tuesday, April 14, 2009
Technology (Part Three) - By Moses
Communications and Code
Communications
In today’s communications, cellular technology promotes cell networks, while satellite promotes the globalization of communication. Accountability. The U.S.A. wants it both ways. I would back satellite for the simple reason that relay stations are not employed in satellite technology; if they are, it is considered industrial espionage. A relay station interprets differently from one station to the next. Perception is changed, promoting schizophrenia. One satellite (backed up by a ground hub) used by the whole world just means that it is a level playing field. It does not mean that every phone is connected together at the same exact time. All it means is that you are accessing the same monitor. You can’t monitor 1 billion phone calls at the same time.
Code
Phones that operate at a standardized frequency allow a person to know that they are grounded. Implementing relay station to relay station changes perception so many times over that a person has no sense of where the signal is going; therefore a person subconsciously mistrusts.
Something was said. Is it true? What was said? In what context was it said? Do you believe the person? Did they look left? What is the root issue? Did the person convict the other person? How was it said? Were both avenues shot down? These are all measures of a person’s ability to communicate.
To communicate effectively a person has to be honest. Once a person is caught in a lie, their credibility is in question.
You never take what someone says to be a lie. Once you do, you become gullible. This is a term used to express a person’s inability to communicate effectively.
Once you’ve decided what it is you believe, you never give up that idealism. Stray too often and your perception becomes altered, and this is misinformation.
Once it is spotted, a person’s opinion of another is influenced.
Pharmaceuticals were introduced as a way of combating misinformation, so as to not change a person into becoming a controlled entity. Emancipation is the cleansing of naivety – to see things straight – To do this is a Monotheist practice. Visions are promoted in this context and we become down to earth.
Any exchange of information carries 7 properties:
1) Truth, 2) Lie, 3) Half-Truth, 4) Argument, 5) Presentation, 6) Perception, 7) Deduction
Once the information has gone through these 7 properties, it becomes a matter of who carries similar perspectives. You associate with these people. To associate with a person who is argumentative towards a deduction they believe in means they won’t back you. Always shut your mouth at this point, as they will do anything to shoot you down. This is called character assassination.
Raising spirits encourages a person to remember their past, to be successful in the present, and to effect change for the future. By the way, should you be watching a hockey game on television, a person thinks they are affecting the present when in actual fact they are ruminating about the past, as there is a roughly 4-second broadcast delay from the source to the monitor. This affects a person’s persona, shaping a person’s conscience.
Memory and Memory Loss
Memory Loss
Frequencies (UHF/VHF) – Filtering – Visual – Triggers – Interface
Memory loss occurs when:
1) There’s too much data
2) Triggers occur too fast
3) Lost in translation of the interface (i.e. Conversion ratios become exponential)
4) Prophecy
Memory
Accessing information (a picture) from the right brain hemisphere is called a ‘photographic memory’. Should this cease, we lose our train of thought, and this is memory loss. Short-term memory loss is when the cerebral cortex (in the right brain hemisphere) is misfiring at an expedited rate. When the rate is too high, the transfer of information from the right brain hemisphere to the left brain hemisphere is corrupted. By losing our ability to have memory, a breakdown of affinity occurs. This is memory loss. Long-term memory loss, in the extreme, is amnesia. Memory loss of extreme severity is Alzheimer’s, and is usually when someone suffers permanent short-term memory loss.
When a person loses the ability to have a continuance of imagination and is controlled, they lose their affinity, and this is synonymous with losing their train of thought. Once someone enters a psychosomatic (subconscious) state without continuance, it leads to night terrors (nightmares). This is when your affinity (epiphanies) is boosted to maintain memory. E.C.T. is the final treatment for combating an accelerated rate, as too fast (or too slow) a rate leads to disorientation (confusion).
A computer carries with it firewalls to counter a misfiring of information. Once these firewalls are broken, memory held in continuance in Random Access Memory can be altered. This is permanent memory loss, or amnesia.
Analog – Representation of the human mind, body, and spirit
Search engines on the internet take the digital representation of the analog domain (a representation of a representation, essentially) by using analog-to-digital converters, sample-and-hold circuitry, and digital-to-analog converters. This makes a person a computer character (a cyborg). Fiber optics transfer this man-made character from communication device to communication device. By using the integer 0, or absolute zero (-273 degrees Celsius, or -459.67 degrees Fahrenheit) – a chronological state – the image is transferred through an input/output module. At this point the signal is quantified, interpolated, translated, encrypted, translated once again, and defined – also known as the gist. In ultimate terms this is considered an analogy or legend. Herein lies the problem. Software is controlled by firewalls, essentially Centurion rules (some have speculated this to be the Matrix). This is why we have interfaces, incubators for our babies, to condition our young to be aware of reality. Industrial espionage sometimes occurs, and this is what we see in terms of politics (political science), market research (market sharing), national security (the intelligence community), and accessing our cerebral cortex – our imagination (affinity to the Lord), allowing us to tell the truth.
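The analog-to-digital and digital-to-analog conversion described here can be sketched as a simple quantise/de-quantise round trip; the 16-bit depth, 48 kHz rate, and 1 kHz test tone are assumed example values, not anything specified in the text.

```python
import numpy as np

def adc(signal, bits=16):
    """Quantise samples in [-1, 1] to signed integer codes, like an A/D converter."""
    levels = 2 ** (bits - 1) - 1
    return np.round(np.clip(signal, -1.0, 1.0) * levels).astype(int)

def dac(codes, bits=16):
    """Map the integer codes back to floating point, like a D/A converter."""
    levels = 2 ** (bits - 1) - 1
    return codes / levels

# Round-trip a 1 kHz sine at 48 kHz; what is left over is quantisation error.
fs = 48000
x = np.sin(2 * np.pi * 1000.0 * np.arange(fs) / fs)
error = dac(adc(x)) - x          # on the order of 2**-16 for 16-bit conversion
```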
Communications
In today’s communications Cellular technology promotes Cell Networks. Satellite promotes the Globalization of communication. Accountability. The U.S.A wants it both ways. I would back satellite for the simple reason that relay stations are not employed in satellite technology as if they are it is considered industrial espionage. A relay station interprets differently from one station to the next. Perception is changed promoting Schizophrenia. One Satellite (backed up by Ground Hub) used by the World just means that it’s an even playing field. It does not mean that every phone is connected together at the same exact time. All it means is that you’re accessing the same monitor. You can’t monitor 1 billion phone calls at the same time.
Code
Phones that operate at a standardized frequency allows for a person to know that they’re grounded. Implementing relay station to relay station is changing perception so many times over that a person has no sense of where the signal is going, therefore a person subconsciously mistrusts.
Something was said. Is it true? What was said? In what context was it said? Do you believe the person? Did they look left? What is the root issue? Did the person convict the other person? How was it said? Were both avenues shot down? These are all measures of a person's ability to communicate.
To communicate effectively a person has to be honest. Once a person is caught in a lie, their credibility is in question.
You never take what someone says to be a lie. Once you do, you become gullible. This is a term used to express a person's inability to communicate effectively.
Once you've decided what it is you believe, you never give up that idealism. Stray too often and your perception becomes altered, and this is misinformation.
Spot it, and a person's opinion of another is influenced.
Pharmaceuticals were introduced as a way of combating misinformation, so as not to turn a person into a controlled entity. Emancipation is the cleansing of naivety – seeing things straight – and to do this is a Monotheist practice. Visions are promoted in this context and we become down to earth.
Any exchange of information carries 7 properties;
1) Truth, 2) Lie, 3) Half-Truth, 4) Argument, 5) Presentation, 6) Perception, 7) Deduction
Once the information has gone through these 7 properties, it becomes a matter of who carries similar perspectives. You associate with these people. Associating with a person who is argumentative towards a deduction they believe in means that they won't back you. Always shut your mouth at this point, as they will do anything to shoot you down. This is called character assassination.
Raising spirits encourages a person to remember their past, to be successful in the present, and to affect change for the future. By the way, should you be watching a hockey game on television, you may think you are affecting the present, when in actual fact you are ruminating about the past, as there is a 4-second tape delay from the source to the monitor. This affects a person's persona, shaping a person's conscience.
Memory and Memory Loss
Memory Loss
Frequencies (UHF/VHF) – Filtering – Visual – Triggers – Interface
Memory loss occurs when;
1) There’s too much data
2) Triggers occur too fast
3) Lost in translation of the interface (i.e. Conversion ratios become exponential)
4) Prophecy
Memory
Accessing information (a picture) from the right brain hemisphere is called a 'photographic memory'. Should this cease, we lose our train of thought, and this is memory loss. Short-term memory loss is when your cerebral cortex (in our right brain hemisphere) is misfiring at an expedited rate. When the rate is too high, the transfer of information from the right brain hemisphere to the left brain hemisphere is corrupted. By losing our ability to have memory, a breakdown of affinity occurs. This is memory loss. Long-term memory loss, in the extreme, is amnesia. Memory loss in extreme severity is Alzheimer's and is usually when someone suffers permanent short-term memory loss.
When a person loses the ability to have a continuance of imagination and is controlled, they lose their affinity, and this is synonymous with losing their train of thought. Once someone enters a psychosomatic state (the subconscious) without continuance, it leads to night terrors (nightmares). This is when your affinity (epiphanies) is boosted to maintain memory. E.C.T is the final treatment in combating an accelerated rate, as too fast (or too slow) a rate leads to disorientation (confusion).
A computer carries with it firewalls to counter a misfiring of information. Once these firewalls are broken, memory in continuance in Random Operating Memory can be altered. This is permanent memory loss, or amnesia.
Analog – Representation of the human mind, body, and spirit
Search engines on the internet take the digital representation of the analog domain (a representation of a representation, essentially) by using analog-to-digital converters, sample-and-hold circuitry, and digital-to-analog converters. This makes a person a computer character (a cyborg). Fiber optics transfers this man-made character from communication device to communication device. By using the integer 0, or absolute zero (-273 degrees Centigrade, or about -460 degrees Fahrenheit) – a chronological state – the image is transferred through an input/output module. At this point the signal is quantified, interpolated, translated, encrypted, translated once again, and defined – also known as the gist. In ultimate terms this is considered an analogy or legend. Herein lies the problem. Software is controlled by firewalls, essentially Centurion – rules (which some have speculated to be the Matrix). This is why we have interfaces, incubators for our babies, to condition our young to be aware of reality. Industrial espionage sometimes occurs, and this is what we see in terms of politics (political science), market research (market sharing), national security (the intelligence community), and accessing our cerebral cortex – our imagination (affinity to the Lord), allowing us to tell the truth.
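Here is a minimal sketch in Python of the converter chain named above – sample-and-hold, analog-to-digital quantization, digital-to-analog reconstruction. The 1 kHz test tone, 44.1 kHz sample rate, and 16-bit depth are just example values, not figures taken from the paragraph.

import math

def sample_and_hold(signal, sample_rate, duration):
    # Hold the value of a continuous signal (a function of time) at fixed intervals.
    count = int(sample_rate * duration)
    return [signal(i / sample_rate) for i in range(count)]

def analog_to_digital(samples, bits):
    # Map each held sample (assumed to lie between -1 and 1) to one of 2**bits integer codes.
    levels = 2 ** bits
    return [round((s + 1.0) / 2.0 * (levels - 1)) for s in samples]

def digital_to_analog(codes, bits):
    # Map the integer codes back to the -1..1 analog range.
    levels = 2 ** bits
    return [c / (levels - 1) * 2.0 - 1.0 for c in codes]

tone = lambda t: math.sin(2 * math.pi * 1000 * t)   # example 1 kHz test tone
held = sample_and_hold(tone, 44100, 0.001)
codes = analog_to_digital(held, 16)
rebuilt = digital_to_analog(codes, 16)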
Friday, April 10, 2009
The Somalia Pirates by Gråulf
As you know, I have been following the pirate situation around the horn of Africa for a long time because I find it so implausible that pirate attacks are tolerated in this day and age. There are 12 to 15 international war ships patrolling the area, but they are tied hand and foot by international law and have been unable to stop the piracy.
There is a resolution (1838, passed in October) which authorizes the use of "necessary means", meaning force if need be, to stop piracy in international waters. There is also another resolution (1816), which allows anti-pirate operations within Somali waters, but only with the agreement of the Somali transitional government.
But even all these operations have to be conducted within international law, defined in this case as the provisions of the UN Law of the Sea Convention.
There has also been a legal opinion by the Foreign Office in London that captured pirates cannot necessarily be sent back to whatever authorities can be found in Somalia, in case they are subject to harsh treatment. That would contravene the British Human Rights Act. The pirates captured in the Royal Navy action have now been handed over not to Somalia, but Kenya.
The Law of the Sea Convention places limitations on daring action. Under Article 110 of the convention a warship has first to send an officer-led party to board a suspected pirate ship to verify any suspicions.
The warship cannot just open fire. Any inspection has to be carried out "with all possible consideration". That sounds rather tentative, and totally nuts. I also understand that you cannot under international law convert a commercial ship into a kind of warship. The issue of who will put pirates on trial is a legal minefield as well, and has yet to be resolved. I suggested to my daughter Dana that this should be something the International Court should take up, but she tells me that the court only has jurisdiction over signatory countries, and no jurisdiction over anything happening in international waters. So, what the hell good is it?
Now the pirates have captured an American ship (actually it is a Danish ship sailing under American colors), and the crew managed to take the ship back. I thought this would finally lead to some progress, or at least make the pirates think twice before they attacked another American ship, but now the American navy is turning the situation into a giant cluster fuck.
An interesting side note: the first American warship on the scene was the destroyer USS Bainbridge. It was named after William Bainbridge, the US Navy Commodore who delivered a million dollars in tribute to the Dey of Algiers in 1800 to keep the Barbary pirates from raiding American merchant ships.
The Bainbridge should have put boats in the water, and ordered the pirates to surrender immediately. If the pirates threatened their hostage they should have been told that if they harmed him they would be killed in the most horrible and painful way possible, and then their remains would be sewn into dead pigs and buried.
Now the pirates have had time to think, and they are turning the situation to their advantage. It has also given pirates ashore time to react, and they are coming to the rescue of their brethren with other captured ships. These captured ships apparently have hostages aboard, and the hostages will be used to intimidate the American forces into meeting any demands they care to make.
Where is John Wayne when you need him?
Gråulf
Wednesday, April 8, 2009
Technology - Part 2 (By Moses)
Nano-Second/Sub-Zero Nano Second
A nanosecond is 0.000000001 seconds (one billionth of a second).
A sub-zero nanosecond is -0.000000001 (negative energy, negative time, stored information).
Sub-zero nanoseconds are used in sample and hold circuits to filter the trigger. This makes the digital reproduction seem real. It is used only in professional audio and graphics software applications. Theoretically, this technology could be used to create temporal and spatial changes in our environment.
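As a quick check of the unit itself in Python (the negative 'sub-zero' value simply mirrors it):

NANOSECOND_IN_SECONDS = 1e-9             # one billionth of a second
SUB_ZERO_NANOSECOND = -1e-9              # the negative counterpart described above
print(f"{NANOSECOND_IN_SECONDS:.9f}")    # 0.000000001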
Internet versus Intranet
An intranet is the connection of hard-drives (cards) to a main source (hub). An internet connects these cards to a hub and to each other.
The first intranet was developed in WWII, when the Americans and the British had their submarines' hard-drives (cards) connected to one central hub. The submarines could only communicate with the central hub. This was essentially an internet, as the central hub could share information discriminately. Location was usually shared from the central hub. In this configuration the hub itself was designed as a software application. This is what we see today in the form of search engines. This, in its purest form, is networking. For example: e-mail, MySpace, Facebook, YouTube – any software application, for that matter.
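A toy sketch in Python of the distinction: in the intranet layout every card links only to the hub, while in the internet layout the cards also link to each other. The node names are made up for illustration.

# Each topology maps a node to the set of nodes it can reach directly.
intranet = {
    "hub": {"card_a", "card_b", "card_c"},
    "card_a": {"hub"},
    "card_b": {"hub"},
    "card_c": {"hub"},
}
internet = {
    "hub": {"card_a", "card_b", "card_c"},
    "card_a": {"hub", "card_b", "card_c"},
    "card_b": {"hub", "card_a", "card_c"},
    "card_c": {"hub", "card_a", "card_b"},
}

def direct_link(topology, src, dst):
    # True if src can talk to dst without going through the hub.
    return dst in topology.get(src, set())

print(direct_link(intranet, "card_a", "card_b"))  # False - cards only reach the hub
print(direct_link(internet, "card_a", "card_b"))  # True - cards also reach each other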
Four properties of an internet;
1) Sharing information
2) Trading information
3) Securing information
4) Selling information
Gates and Limiters (Nyquist Limit)
Gates – in the analog domain, a gate is considered intolerance. The paradigm relationship is considered expansion. In other words, 1:2, 1:4, 1:16, 1:256 and 1:65536 are exponential rates of expansion and are considered to be tolerance (you're tolerant up to a point). Past its barometer level it becomes too fast a rate and is discriminated upon. In the digital domain, a gate is considered stringency (defined). Expansion in this context is considered a variance of concept (triggers). A gate in its purest form takes a frequency barometer, for example 20 Hertz: any frequency above is mainstreamed and any frequency below is cut. That is why some machines use high-pass filters and some machines use low-cut filters. To ensure that the signal is clean, a 12 Hertz guard-band is used, and this increases the headroom of the signal (as the barometer of the gate is set at 12 Hertz, the peak frequency increases, allowing the signal to be recorded hotter, i.e. 10 dBu as opposed to 6 dBu).
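A minimal sketch in Python of the gate behaviour just described: components at or above the cutoff 'barometer' pass, components below it are cut. The cutoff values and the example component list are illustrative only.

def high_pass_gate(components, cutoff_hz=20.0):
    # components is a list of (frequency_hz, amplitude) pairs; keep only those at or above the cutoff.
    return [(freq, amp) for (freq, amp) in components if freq >= cutoff_hz]

signal = [(12.0, 0.8), (20.0, 0.5), (440.0, 1.0), (10000.0, 0.3)]
print(high_pass_gate(signal))                   # the 12 Hz component is cut
print(high_pass_gate(signal, cutoff_hz=12.0))   # with the barometer at 12 Hz, everything passes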
Limiters – in the analog domain, a limiter is considered intolerance. The paradigm relationship is considered compression. In other words, 2:1, 4:1, 16:1, 256:1 and 65536:1 are exponential rates of compression and are considered to be tolerance. Past its barometer level it becomes too much pressure and is discriminated upon. In the digital domain, a limiter is considered stringency (defined). Compression in this context is considered a variance of concept (triggers). A limiter in its purest form takes a frequency barometer, for example 20 kHz: any frequency below is mainstreamed, and any frequency above is cut. That is why some machines use low-pass filters and some machines use high-cut filters. When it comes to limiters and compressors, the filter carries a property called a guard-band (firewalls), and in professional CD applications a 2 kilohertz guard-band is used to prevent distortion (also known as unwanted noise). This is a Nyquist principle and creates headroom. In Digital Audio Tape there is no guard-band to protect the print, and this is why an audio engineer uses a limiter so as not to clip (anything past 0 dBu); the highest frequency recorded is less than 20 kHz, enabling the signal to appear to have more depth – A.K.A more headroom. There is more headroom with D.A.T tape than with compact discs, as compact discs incorporate a 2 kHz guard-band to print onto disc without distortion (even with a built-in limiter, the print is only capable of printing frequencies up to 20.05 kHz, a full 3.95 kHz less in sampling than D.A.T). In the Nyquist Limit all frequencies, positive and negative, are accounted for, and this is why the maximum frequency is doubled (in magnetics this is considered positive and negative ions). In other words, through the Nyquist limit we see 20.05 kilohertz + 2 kilohertz guard-band = 22.05 kilohertz, x 2 = 44.1 kilohertz (C.D). D.A.T takes 24 kilohertz and doubles it to follow the Nyquist principle of positive and negative waveforms, which encompasses all spectrums of sound.
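The sample-rate arithmetic from the paragraph above, written out in Python (audio band plus guard-band, doubled per Nyquist):

cd_audio_band_khz = 20.05
cd_guard_band_khz = 2.0
cd_sample_rate_khz = (cd_audio_band_khz + cd_guard_band_khz) * 2
print(cd_sample_rate_khz)    # 44.1 - the Compact Disc sampling rate

dat_max_frequency_khz = 24.0
dat_sample_rate_khz = dat_max_frequency_khz * 2
print(dat_sample_rate_khz)   # 48.0 - the D.A.T sampling rate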
Tape Speed
7.5 inches/second maximizes domains, meaning that there are more domains to work with. In professional applications, 2-inch tape is used in the analog domain and Super-8 tape is used in the digital domain. In amateur and semi-professional applications tape speed is sped up to 30 inches/second and is used to accelerate rate. By using less tape and maximizing domains, the human brain has a greater chance of interpreting the data effectively, also known as optimum (vs. peak). 4-track on cassette is one side only, as the left channel is in stereo, as is the right channel. This is also known as surround sound, or THX – a Lucas design (commonly referred to as DTS – Digital Theatre Systems).
Broadcast Networks
From a source (camera and/or microphone, through a cable and/or antenna) to satellites – A.K.A monitors.
Miramax – E.T
M.G.M (Metro-Goldwyn-Mayer) – James Bond
Warner Brothers – A Clockwork Orange
Disney – The Jungle Book
Columbia (C.B.S, A.B.C, N.B.C included) – DaVinci Code
Alliance – Johnny Mnemonic
Fox
T.B.S – The Atlanta Braves
C.B.C – Hockey Night in Canada
C.T.V – The Toronto Blue Jays
B.B.C – Monty Python
Sony – Amanda Marshall
Paramount – The Sting
Orion
Dream-Works
Nelson/Embassy
Universal
MCA
In video we see test tape on the feed, also known as propagation (or influence-excise). There is frequency bias (the colorization of sound), amplitude bias (volume), panning bias (left and right channels), quadrant bias (where the focus of the screen appears), reverb or contrast (the depth of the signal), the pitch and pixel count (resolution), the timbre (brightness or tone, the equalization), and leagues (the overall perception broadcast and received – A.K.A sonar). To affect the equalization of the signal, logarithm-synthesis is used, A.K.A log. I.e. 1 cent of 100 is .01 (log10), 1 cent of log10 (.01) is .001 (log20), 1 cent of log20 is .0001 (log30), etc. This provides us with seamless edits and makes the picture pleasing to the eye.
Wednesday, April 1, 2009
Technology - A Five Part Series - By Moses
Technology
Computers – i) Apple ii) International Business Machines (I.B.M) iii) Hewlett Packard iv) Olivetti v) N.C.R (Bank Machines – Wide Area Network).
Microphones- i) Ribbon ii) Dynamic iii) Condenser – Polar Patterns in reference to all three.
Cameras – i) Analog ii) Digital
Automotives - refer to article in Pop-Culture
Planes – i) Boeing 747, 767 ii) Mirage iii) F-16/17/18 iv) C-16 v) Concorde vi) Cessna vii) MiG
Trains – i) Bombardier ii) GO Train iii) C.N Rail (Canadian National) iv) Via Rail v) Amtrak
Radar – Tracking – Satellites
Cable – i) Ground (Copper) ii) Fiber-optics-Satellite iii) Standard frequency – i.e. 900 Megahertz – Cellular.
The internet vs. singular workstations
Fuel – i) Solar ii) Electric iii) Hydrogen iv) Wind v) Fossil vi) Nuclear
A/C – D/C – Alternating Current – Direct Current
A.M (Amplitude Modulation) and F.M (Frequency Modulation)
Source – Monitors (also known as an Input / Output Module) – a channel
Electronic Principles and Digital Circuitry A.K.A Silicon
Computers
There are 9 characters in a byte. One byte sequence is binary, and follows the sequence 1, 2, 4, 8, etc. This is called pulse code modulation, and other pulse codes such as pulse-width, pulse-number, and pulse-position are considered to be modulation or synthesis – osmosis (change).
In binary the sequence is as follows; 1 is on, 0 is off
The least significant bit is the bit to the furthest right. i.e. – 10100(1- least significant bit)
The most significant bit is the bit to the furthest left. i.e. – (1)01001
The binary word is a list of on and offs. i.e. - 101100101 – the bit rate is the number of bits in the binary word.
To calculate the value of the binary word, simply add together, from right to left, the weight of each bit that is set to 1. The weights of a binary word are as follows; 1, 2, 4, 8, 16, 32, 64…
To calculate; 101001101 – bit rate is 9
256 + 64 + 8 + 4 + 1 = 333, and 333 x 9 = 2997
Therefore the binary word has 2997 characters
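A short check in Python of the worked example above; the final multiplication by 9 follows the convention of 9 characters per byte used throughout.

def binary_word_value(word):
    # Add the weight (1, 2, 4, 8, ...) of every bit that is set to 1, counting from the right.
    return sum(2 ** position for position, bit in enumerate(reversed(word)) if bit == "1")

word = "101001101"
value = binary_word_value(word)   # 256 + 64 + 8 + 4 + 1 = 333
print(len(word))                  # bit rate: 9
print(value)                      # 333
print(value * 9)                  # 2997 characters, per the convention above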
Computers and Communication
An exchange of information from one data bank to another, where a data bank can be a disk, a hard-drive, a software application, or random access memory. A disk stores information. With a disk, you put it into a drive and the hard-drive (random operating memory) stores it. The hard-drive then retrieves this information by using its random access memory. The disk in this configuration is the master device.
A hard-drive uses both R.O.M, and R.A.M. Operating memory is information stored and access memory is retrieved information.
Software applications are platforms used to store and retrieve information. Essentially, the concept of software includes hardware, as in order to access any information from a computer you're retrieving this information from your R.O.M (which is a software application internal to the hard-drive). Each independent digital workstation is a search engine. Copyrighting it makes it a legal commodity. Software applications access these stations through a Local Area Network and then a Wide Area Network – the intranet in the private sector and the internet in the public sector. For a company to burn D.V.Ds (Digital Versatile Discs) for the masses outside copyright means that they have the right to affect the storage medium, test frequencies, also known as propagate – influence.
Video Games
In 1971, home computers were introduced at a 2-bit rate – 3 kilobytes, or 27,000 characters – and the video game Pong was released. Today, when two-bit is used, the least significant bit is the indicator as to whether the machine is on or off at 1 and 0 (a switch), and the most significant bit is the total number of characters, undefined by specification.
In 1975, the technology changed towards a 4-bit format – 15 kilobytes, or 135,000 characters – and we saw Atari's Space Invaders and Galaxian, video games that were put in arcades for the public. These video games taught intolerance towards aliens, anyone outside their ways. People that played the game in an arcade had to stand, as the game was designed for this purpose.
In 1977, Atari, Commodore, Apple, I.B.M, and Coleco-Vision introduced 7-bit, also known as 64K – 127 kilobytes, or 1,143,000 characters. These were home systems using conceptual video games such as Donkey Kong and Zapper, designed to teach a person to fight paranoia (while sitting on a couch).
In 1983, arcades introduced the video game Dragon's Lair. This was the first virtual-reality video game on the market in the arcades. It promoted the idealism of making choices. If you made the wrong choice you died. Virtual reality is reality – a person's vision.
In 1987, Nintendo and Sega home systems surfaced, and the number of conceptual video games is now so ever-changing that one must question what the games are teaching regarding the education of our children.
Significant analogies regarding the properties of Video Games;
1. Game Over – Death
2. Score – Digital representation of the analog domain, the player him/herself
3. Killing – De-sensitization
4. Arcades – Community Hard-drive (Hub)
5. Home Systems – Hubs and interface cards (Cartridges and System-internal software)
6. Software (interface cards) – Hard-drive (circuitry) - Grid
Synopsis
By 1996 the professional application used in post-production technology was 16-bit, or 65,535,000 kilobytes (65,535 megabytes), which is 294,913,359,000,000 characters. It operated at 2 gigahertz.
In professional applications today, as of Jan 16th/08, an iPod uses 80 gigabytes of R.O.M at 2 gigahertz, and post-production facilities use 384-bit at 20 gigahertz – a lot of information to account for, at an enormous speed. In 1998, the professional application of R.A.M was 4 gigabytes, used by Apple, in the name of the Seagate Barracuda (operating at 2 gigahertz).
In my opinion the professional standard application should be 24-bit, or 16,777,215,000 kilobytes (16.777215 terabytes), or 16,777,215,000,000 bytes, which is 150,994,935,000,000,000 characters (operating at 4 gigahertz). My logic is this: in order for a man or woman to account for more characters than this amount, it would be such an 'information overload' that their system would turn to paranoia. In modern-day home computers they're starting to use terabytes. A terabyte is 1 trillion bytes (1,000,000,000,000 bytes), or 9,000,000,000,000 characters.
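The 24-bit figures above can be reproduced with a few lines of Python, again using the convention of 9 characters per byte:

bits = 24
kilobytes = (2 ** bits - 1) * 1000   # 16,777,215,000 kilobytes, as stated above
bytes_total = kilobytes * 1000       # 16,777,215,000,000 bytes
characters = bytes_total * 9         # 150,994,935,000,000,000 characters
print(kilobytes, bytes_total, characters)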