Microchip is alien technology

Discussion in 'Laptop Hardware' started by sparkster, Jun 29, 2016.

  1. sparkster

    sparkster

    Joined:
    Aug 15, 2015
    Messages:
    139
    Likes Received:
    26
    Microchip technology is something which has plagued me for over 20 years, ever since I was a young child. Even back then there was something about the microchip which just didn't sit right with me, and I have spent much of my life researching the history and development of the integrated circuit. As a child, I felt intuitively that the microchip was an off-world technology and, although I initially grew up to become more sceptical, there is now much more information available; I have done a lot more research and have become convinced that microchip technology came from UFOs. Strangely enough, despite my intuitive feeling as a child, I didn't know at the time that many military men, such as Lt Colonel Corso, would come forward in the future to claim that the microchip was indeed reverse-engineered from UFO technology. There is now plenty of military witness testimony to confirm this, and I have found many holes in the officially documented history and development of the microchip which make it all extremely questionable. Even the people who were involved can't seem to get their story straight and, oddly enough, none of them knew exactly how a microchip worked, only that it did. Even more peculiarly, I went on to become a UFO experiencer myself.
     
    sparkster, Jun 29, 2016
    #1
    IBMPC8088 and nytegeek like this.

  2. Corzhens

    Corzhens

    Joined:
    Feb 19, 2016
    Messages:
    429
    Likes Received:
    87
    Location:
    Philippines
    Although this is outside the realm of science, I think your observations are interesting to note. I believe that inventions which benefit mankind are a gift from God, and that the design is provided via a dream or an inspiration. As for microchip technology, it was developed in the mid-20th century to overcome the limits of computer memory capacity. And as we all know, the microchip has grown by leaps and bounds: from kilobytes, we are now in the terabyte range.
     
    Corzhens, Jun 29, 2016
    #2
    IBMPC8088 likes this.

  3. IcyBC

    IcyBC

    Joined:
    Jul 12, 2015
    Messages:
    756
    Likes Received:
    116
    I am not sure how I feel about this either! At the vet, they want me to consider putting a microchip into my dog, but ehh, I am not going for it! I can't imagine that one day we might be required to have a microchip in our bodies! I hope by then I am gone already :(
     
    IcyBC, Jul 1, 2016
    #3
    IBMPC8088 likes this.
  4. nytegeek

    nytegeek

    Joined:
    Sep 13, 2015
    Messages:
    327
    Likes Received:
    56
    What a ridiculous notion. If you still think this, you need to study the history again, because you are clearly missing something or didn't understand what you studied. Computing moved from mechanical devices to electronic devices out of need during World War II. The microchip came later, but the first ones were of very simple design, and it is entirely believable that man came up with them without outside influence. Noticing that systems got faster when circuits were laid closer together on a chip was also a natural progression; no aliens needed.

    There is a difference between having an open mind and believing ridiculous things that fly in the face of known facts. There is no conspiracy or aliens here. Give it a rest.
     
    nytegeek, Jul 1, 2016
    #4
    IBMPC8088 likes this.
  5. IBMPC8088

    IBMPC8088

    Joined:
    Feb 1, 2016
    Messages:
    371
    Likes Received:
    145
    Not to credit or discredit either stance, but explaining some of the details may be helpful and in order.

    Originally, the logic gates (AND, OR, NOT, etc.) and flip-flops used by computer systems today were built from vacuum tubes.
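To make the gate-and-flip-flop idea concrete, here is a minimal, purely illustrative sketch in Python (the original circuits were of course tubes and wires, not software): the basic gates as functions, plus an SR latch, the simplest flip-flop, built from two cross-coupled NOR gates.

```python
# Logic gates modeled as functions on bits (0/1). Purely illustrative:
# this mirrors the logic, not the electronics.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a
def NOR(a, b): return NOT(OR(a, b))

def sr_latch(s, r, q=0, qn=1):
    """Cross-coupled NOR latch: feed the outputs back until they settle."""
    for _ in range(4):  # a few passes are enough for this feedback loop
        q_new, qn_new = NOR(r, qn), NOR(s, q)
        if (q_new, qn_new) == (q, qn):
            break
        q, qn = q_new, qn_new
    return q, qn

# Set (S=1) stores a 1; S=R=0 holds the state; Reset (R=1) stores a 0.
q, qn = sr_latch(1, 0)         # set:   (1, 0)
q, qn = sr_latch(0, 0, q, qn)  # hold:  (1, 0) - this is the "memory"
q, qn = sr_latch(0, 1, q, qn)  # reset: (0, 1)
```

The hold case is the whole point: with both inputs at 0, the feedback keeps the last stored bit alive, which is why flip-flops can serve as memory.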

    Prior to that, you had player pianos and mechanical devices, which were the predecessors of vacuum tubes and (later) clock-based circuits.

    Rather than using mechanical timing, new systems started to use digital signals and charges as a clock and a constant, rather than a motor (although mechanical parts stuck around for a long time in the form of drums, punch cards, and magnetic drives, even after neat gear-based machines and player pianos became history).

    Before today's integrated circuitry and VLSI, you had transistors in between, which were able to perform the functions of vacuum tubes in a fraction of the space.

    Basically, the transfer of charge that makes those circuits possible could be done with metal-oxide semiconductors, which were more dependable and able to transfer charges more quickly and efficiently. A transistor has three terminals, and where enough charge builds up you get the equivalent of a digital 1, whereas if it isn't built up enough, it's a 0. Any level between the two thresholds is usually treated as "off" below one point or "on" above the other, even though technically an analog-to-digital (ADC) conversion is what makes it a digital circuit in form.

    Transistors were around for roughly two decades before integrated circuitry took over, and discrete designs remained common from the early 60's up to the early 1980's. Some people still use them today and build computers from them. TTL circuits and 555 timers are still pretty popular.

    The integrated circuits you see today are the equivalent of vacuum tubes and transistors, only in microscopic form. More accurately, microlithographic form (hence, microlithography). Designers realized they could get the same results as with transistors by using chemicals and films that held and transferred charges in layers, rather than using bulky transistors or absolutely huge vacuum tubes.

    They still designed the circuits at the same large scale as the transistor designs of the 60's and 70's, but would then shrink them down and use PCBs as the host for the microlithographic circuits they created. Once mass-produced, that cut power consumption and development cost significantly, and largely eliminated the signal-transfer bottlenecks of earlier designs, bringing speeds from hertz (Hz) up to megahertz (MHz). But the smaller pathways in such compact spaces increased heat, and eventually cooling was needed as the designs continued to shrink.

    Today, we've moved past even the microlithography of the 80's and 90's, and are using ultraviolet lithography and other techniques that are several factors smaller still, but at the expense of generating a lot more heat and requiring significant cooling to prevent errors and overheating.

    Whether or not the remains of the Roswell alien crash said to be stored at the Dugway Proving Ground in Utah had anything to do with insight into the more advanced techniques being used today and tomorrow, it should be understood that all of the digital and analog logic circuits in use today could always be made from transistors and vacuum tubes (albeit requiring vast amounts of space and power), or with any of the other means available.

    The foundation for analog and digital computing existed prior to any reported alien interaction in the late 40's and early 50's.

    Advances to it may have been humanity's own, or it may have been from uncharted territories...but the traditional technology itself was done right here on this planet by people with great ideas and a desire to see it through. :)
     
    IBMPC8088, Jul 1, 2016
    #5
    IcyBC likes this.
  6. sparkster

    sparkster

    Joined:
    Aug 15, 2015
    Messages:
    139
    Likes Received:
    26
    Well, let's see now - first of all, calling it a ridiculous notion is actually what's ridiculous, because this notion is not ridiculous at all. Known facts are not known facts - you have just been taught that they are, and you are blindly accepting it without really looking any deeper. Telling me to give it a rest is just plain out of order. No, I won't give it a rest - not ever - and neither will the thousands of people in government and military all over the world who are now speaking out openly about this and who have the evidence, including people who were directly involved in these developments (including some of the world's top scientists - are you saying you know more and are cleverer than they are? Then why aren't you working with them?).

    There's nothing ridiculous about it, and believing that there is, is nothing but proof of societal conditioning on your part, of allowing yourself to be told what to believe by authority instead of thinking for yourself. You mentioned World War II. Know anything about Operation Paperclip? That was an operation to bring a number of top Nazi scientists into the US after World War II to work on developing the very technology you speak of - the Germans had already been working on reverse-engineering crashed UFOs. The people brought into the US after WWII as part of Paperclip included Wernher von Braun, who went on to work on the Apollo missions (space travel, hint hint - the microchip was first used in the Minuteman missile in 1962) and who also had access to the Roswell crash site. Hermann Oberth was another. Both of those men claimed that the technological development was helped along by otherworldly sources, and both were on Corso's research and development team. All of the documentation is there, but this is not the sort of information taught in schools and colleges. Ask yourself: why not?

    Believable that man came up with them? So how is it that Robert Noyce, the very person credited with inventing the integrated circuit, claimed to have handled one ten years before he'd even invented it? Oops, big slip-up on his part there, and that's certainly not the only one - the guy who invented the microchip, if you ask him, actually hasn't got a clue how it works. I perfectly understand what I've studied, and it's pretty clear from the way you've worded your response that you yourself have very little clue what I'm talking about here. Additionally, there is not only plenty of declassified government documentation which backs up my claims, but many of the world's top scientists, researchers and developers who have been working on these technologies have spoken out openly about this, including people like the former head of Lockheed's Skunk Works, Ben Rich.

    Nor did aliens/UFOs first appear in the 1940's. They have been around a lot longer than that - they have always been around - and if you actually do your historical research in depth, as I have, you will find that there were plenty of electrical devices present on Earth well before man had even learned how to harness electricity. Additionally, if you did any significant amount of research into the UFO phenomenon, you would find that there are in fact ways and means of witnessing them and finding out the truth for yourself; nearly all the great scientists throughout history knew this and communicated with them, but people have become so conditioned, so boxed in, and so influenced and manipulated by society that they have been cut off from the fundamental truth.

    You can either believe what the history books tell you with blind faith and subject yourself to government conditioning, or you can do the real research and find the truth - just because it doesn't fit your paradigm doesn't mean it cannot be true. Ever heard of cognitive dissonance? I suggest you learn how psychology, psychiatry, the mind and human perception work, and how people are being controlled and conditioned on an ongoing basis.

    Didn't you ever wonder why it was that we had the technology to drop an atomic bomb on Hiroshima by the 1940's, and yet we didn't even have calculators until the 1960's? But the microchip is nothing compared to keratin fibers grown through synthetic nanobiotechnology, which integrate with genetic scalar waves and the morphogenic field of humans and act as fiber optics, allowing not only targeted surveillance but also the influencing of human DNA and consciousness. The microchip/integrated circuit is a precursor to this nanotechnology. Does that sound ridiculous too? Then you definitely haven't done your research. There are even scientific papers published in peer-reviewed journals which prove that this technology already exists and is in use. It's a biological, self-replicating nanotechnology which is an artificial intelligence.
     
    Last edited: Jul 5, 2016
    sparkster, Jul 5, 2016
    #6
  7. nytegeek

    nytegeek

    Joined:
    Sep 13, 2015
    Messages:
    327
    Likes Received:
    56
    You have confused an open mind with one that allows trash in. You are looking for conspiracies where there are none. You are drawing lines between things that aren't connected, or are only connected by a series of factors you conveniently leave out to make your argument sound better. I think you are full of it, and an overly verbose statement designed to give you the appearance of being correct isn't going to change that opinion. Have fun with your foil hat.
     
    nytegeek, Jul 5, 2016
    #7
  8. rz3300

    rz3300

    Joined:
    Feb 21, 2016
    Messages:
    224
    Likes Received:
    21
    Well, the fact is that whether you think it is okay or think it is going too far, it is happening and is going to happen, so you might as well just get used to it. I do not think it is going to be some Brave New World or Harrison Bergeron type scenario where we are just robots; when you think of the healthcare industry, these types of technologies can help track diseases and aid doctors in finding problems within the body. I am betting that we will see more and more of this as time goes on.
     
    rz3300, Jul 7, 2016
    #8
  9. nytegeek

    nytegeek

    Joined:
    Sep 13, 2015
    Messages:
    327
    Likes Received:
    56
    The technology is here, it advances rapidly, and some people just don't have the intelligence to understand how it came to be, so they start looking for conspiracy theories and tall tales to explain it. At any rate, you can't put the genie back in the bottle. If you think the microchip itself was too advanced for humans to develop, you are going to be mystified by what's coming. I guess if it makes somebody feel better to believe it was aliens when they can't grasp the subject matter, we should just let them.
     
    nytegeek, Jul 7, 2016
    #9
