Sunday, March 26, 2017

3D XPoint, Welcome to the Future

By Xavier Reinders

Let's start with a little background. There are currently two basic types of memory used by computer systems. First, you have RAM (random access memory), which serves as a temporary workspace for data, keeping memory-intensive programs like video games or complex computation software running smoothly. It can only store information while the system has power; it is the fastest kind of memory, but its price and power requirements make it impractical for long-term storage. That is where hard drives come in. As the second type of memory, they are massive, storage-wise, and can hold information even without power. They come in two forms: HDDs, which typically offer a lot of space but are slower, and SSDs, which offer less space (for the same price) but are a lot faster.

A mock-up of Intel and Micron's new 3D XPoint
So who cares about 3D XPoint (cross point) when we already have these perfectly fine forms of storage? Let's compare the numbers: RAM operates at latencies measured in nanoseconds, SSDs at latencies measured in microseconds (1,000 nanoseconds), and HDDs at latencies measured in milliseconds (1,000,000 nanoseconds). So where does XPoint fall? Supposedly around the 10-nanosecond mark, so it's a touch slower than RAM, but it leaps ahead in other areas.
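To put those orders of magnitude side by side, here is a quick sketch. The numbers are rough, illustrative figures taken from the comparison above, not measured benchmarks:

```python
# Illustrative order-of-magnitude access latencies, in nanoseconds.
# These are rough figures for comparison, not real benchmark results.
LATENCY_NS = {
    "RAM": 1,            # nanosecond scale
    "3D XPoint": 10,     # roughly 10 ns, per early reports
    "SSD": 1_000,        # microsecond scale
    "HDD": 1_000_000,    # millisecond scale
}

def slowdown_vs_ram(device: str) -> float:
    """How many times slower a device is than RAM."""
    return LATENCY_NS[device] / LATENCY_NS["RAM"]

for device in LATENCY_NS:
    print(f"{device}: {slowdown_vs_ram(device):,.0f}x RAM latency")
```

The takeaway is the size of the gaps: an HDD is a million times slower than RAM, while XPoint sits only one step behind it.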

Did I mention it is also a form of mass storage, though a different technology from HDDs and SSDs? So it has a speed approaching that of RAM with the storage capacity of a hard drive, plus a couple of extra hard-drive-like bonuses: it will be relatively cheap, and it won't need power to keep things stored. XPoint is expected to come in above SSDs in price per amount of storage but well below RAM, meaning if you need something of medium size as fast as possible, XPoint is the way to go. The fact that it won't need power to keep its ones and zeros also makes it more viable than typical RAM, since it can preserve important data across power cycles (shutdowns and restarts).

So it's the future, right? Well, not for all of us. It is not expected to replace either RAM or hard drives, but rather to work alongside them, helping in areas where the other two fall short. It is not, however, expected to make a huge incursion into everyday computing. It will mostly be seen in the two places all high-end computer parts go: gaming and big data. Those little advantages that XPoint holds will be magnified a thousandfold at the scale of companies as massive as Google, helping them achieve their goal of world domination.

How Graphic Design Is Revolutionizing the Movie Industry




When most people think of the computer science field, they just imagine programmers sitting and staring at code all day. In fact, it's a much broader field than that, extending into a wide variety of areas that many people would not expect. One of those areas is graphic design.



As movies become better and more complex, it takes a lot to make a good movie these days, especially in an era where Disney movies are being remade in live action, so real-life characters need to accurately portray their animated counterparts. This requires a level of computing and editing skill that can only be delivered by the most capable computer professionals. They have helped transform the movie industry into what it is today with movies such as The Jungle Book and Avatar. Movies from decades ago had far less developed editing methods, and the graphics were vastly inferior to those of modern times. It is thanks to computer scientists that we have these improved methods of design.

All The Resources You'll Need

Whether you are a beginner or an advanced programmer, there are always ways to sharpen your knowledge and skills, especially with so many high-quality resources available.

You want to start but have no idea where to begin? The internet is your best friend.

Codecademy: A website dedicated to teaching coding through interactive lessons and online tutorials, with no prior experience required. They are an education company committed to "building the best learning experience inside and out". Along with individual languages, Codecademy also provides resources for the classroom, including downloadable lesson plans for primary and secondary computer science curricula and the ability to test and track student performance.

r/learnprogramming: With over 300,000 users, LearnProgramming is perhaps Reddit's most popular coding community friendly to all experience levels.

For beginners, multiple FREE online courses can be found throughout the subreddit with languages such as Python, Java, and HTML. Have a homework or project question? Submit a post and many users are quick to respond. Interested in starting a specific project such as applications or websites? Their wiki has links to the language used along with tutorials on how to start.

For the intermediate level coders, the subreddit contains tips and tricks on how to improve in their wiki along with many practice exercises and project ideas.

And for the person looking for professional-level work, they are also here to help, with information on how you can make money, get a job without a degree, or break into the industry at the entry level.


Cracking the Coding Interview: Finally ready to tackle the real world but have no idea what to expect? This is the perfect book, recommended by many professional programmers, that teaches you what you need to know, enabling you to perform at your best. When applying for a full-time position, coding interviews are different from the typical resume-and-questions interview you might expect. Coding interviews are often a test of whether you can solve real-world problems. Many companies will throw short programming questions at you and expect quick solutions with written pseudocode. They already expect you to be well versed in several languages, but can you use your skills in real life?
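To give a feel for the short questions these interviews involve, here is one classic example (chosen for illustration; it is not taken from the book itself): given a list of numbers and a target, find two numbers that add up to the target.

```python
def two_sum(nums, target):
    """Return indices of two numbers that add up to target, or None.

    A classic warm-up interview question: the hash-map approach runs in
    O(n) time instead of the O(n^2) brute-force double loop, and being
    able to explain that tradeoff is usually the point of the question.
    """
    seen = {}  # value -> index of where we saw it
    for i, value in enumerate(nums):
        complement = target - value
        if complement in seen:
            return (seen[complement], i)
        seen[value] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # -> (0, 1), since 2 + 7 == 9
```

An interviewer typically cares less about the final answer than about hearing you reason from the brute-force version to the faster one.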

Prompt 6: Evaluating Sources: Computer Game Addiction

By Qingyang Li

Several years ago, the American Medical Association decided not to classify computer game addiction as a recognized psychological disorder comparable to gambling addiction.

However, arguments about whether or not to treat computer game addiction as a psychological disorder still continue.

Some psychologists, physicians, and researchers strongly believe that computer game addiction is an impulse-control problem (similar to an inability to control gambling habits) and deserves similar recognition as a "real" disorder.

In contrast, there are also mental health professionals (and, of course, members of the general public) who believe it is incorrect to assume that computer games themselves are addictive, and that such a classification would trivialize serious problems like drug and alcohol addiction.

Those who are against classifying computer game addiction argue: "Although some individuals may play computer games to avoid addressing other personal issues, this does not necessarily indicate 'addiction', but rather a less-than-ideal problem-solving approach to life's challenges, stressors, and temptations."

Those who are for classifying computer game addiction argue: "Numerous studies have found associations between computer game addiction and issues such as poor academic performance, work-related difficulties, marital dissatisfaction, poor health, and other psychological problems like anxiety and depression."

In my personal opinion, as a college student majoring in computer science, I have not seen any of my friends become addicted to computer games or fail a class because of playing them. Computer games are engaging and can draw people in, but the kind of addiction we usually worry about harms people the way drugs and excessive alcohol do. While some people have done foolish things because of computer games, that happens far less often than with alcohol and drugs.

Besides, people can get addicted to almost anything; does that mean everything is harmful? What about a person who is addicted to studying (it really happens)?

And yes, if somebody spends too much time playing computer games, it can affect his or her life, but we should not treat that as a disorder. Much of the public, including my parents, has a negative attitude toward computer games; the addiction is the problem, not the games themselves.

Anyway, it's still a good idea to go outside rather than play computer games indoors.

Sunday, March 19, 2017

Prompt 10: Recognizing Truth

By Xavier Reinders

As with many fields and focuses, computer science and engineering (CSE) is subject to misconceptions and misrepresentations in popular media. Being as broad as it is, CSE runs into a large number of these problems, from overgeneralization, to sensationalism, to straight-up stereotyping. In this particular post I cannot cover every problem out there or even name every piece that contains one, so for right now I am going to use a single example to point out some flaws that are common in representations of the CSE field. To do so, I am going to analyze the popular TV show "Mr. Robot" and highlight how it gets some things right and others wrong.

First things first: sensationalism, or the act of presenting something in a way that provokes public interest while sacrificing accuracy. Does Mr. Robot do it right or wrong? Right, actually. While it is a little grandiose, it stays within the general logic limits of CSE, and the exaggeration comes only in the theatrics, not the method.

So a determined hacker sits at his computer and pounds out millions of lines a minute to retrieve one single line of information to hand off to the super spy who goes off to save the world, right? Wrong. As a human being you are already behind in the CSE world; you are slower and less educated about the system than the system itself. So what does a real hacker do? They write a program; they make a computer outsmart the other computer. Basically, a professional hacker, which is a real job, creates a program that does all the computation for them and just tells it which computing to do. Mr. Robot shows this well: you do not see him sitting down at his computer to race the system, but rather running programs to beat the system.

Next, let's look at representation. Mr. Robot in the show is a corporation-hating, antisocial, save-the-world vigilante working for a hacker group trying to bring down the corporations that run the world. Although I'm sure some in the CSE field are like this, just as there probably are in every profession, it is not a large or especially pronounced group. Most people in CSE work for some company or another, not an underground hacker cell trying the Fight Club Approach™, and are generally law-abiding citizens who work in IT, system management, or programming. However, professional hackers are a real thing, as I said above, and Mr. Robot actually works as a legal one in the show; their job is to try to break a company's security so the company can plug any holes.

As with the hate-the-world sentiment above, there are a fair few in CSE who are antisocial, but most are pretty social people, and some are even famous; Zuckerberg and Gates are good examples. Also, as a general rule, CSE majors do not get to be alone while they work: a majority will work in groups or project teams, and because of their broad reach will have to interact with all parts of a company or with outside customers. So while we may not be the most social group, we also do not spend all our time in dark hoodies with only our computer screens to light the way and a general hatred of the world to fuel us.

An Introduction to AMD's Ryzen

By Steven Wang

If you are into computers at all, you have probably heard about AMD's new Ryzen line of processors, which has demonstrated groundbreaking performance. It has been hyped for a while now, but what does it all mean?

First off, what even is a processor, and why is it so important? The central processing unit, or CPU, is essentially the brain of the computer; it uses electrical signals to carry out the instructions of a program by performing the arithmetic, logical, and control operations specified by other components or the user. In a computer, the CPU is mounted on the motherboard under a fan to keep it cool.
Example of a CPU that has been properly installed 
What a CPU looks like (Intel)
For years, Intel has dominated the CPU market and is mainly known for its "Core" line of CPUs, such as the Core i3, i5, and i7. Each has its own features, such as core count and clock speed, which usually increase as you go up the line. In basic terms, the more cores a CPU has, the more data it can handle, and a higher clock speed means it can compute tasks faster. Although AMD generally puts more cores in its CPUs, Intel processors are still faster per core. When building or choosing a computer to purchase, people are often recommended AMD processors only if they're on a budget. With the arrival of Ryzen, however, the competition may change.
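The cores-versus-clock tradeoff can be sketched in very rough terms. The figures below are hypothetical and assume a workload that splits perfectly across cores, which real programs rarely do:

```python
def peak_throughput(cores, clock_ghz, work_per_cycle=1.0):
    """Very rough peak throughput estimate: cores * clock.

    This assumes perfectly parallel work; real performance depends
    heavily on the workload and on per-core efficiency.
    """
    return cores * clock_ghz * work_per_cycle

# Hypothetical chips: many slower cores vs. fewer faster cores.
many_cores = peak_throughput(cores=8, clock_ghz=3.2)
fast_cores = peak_throughput(cores=4, clock_ghz=4.2)

print(many_cores, fast_cores)  # 25.6 vs 16.8
```

Under this idealized model the eight-core chip wins, but on a single-threaded task only one core's clock speed matters, which is why "faster per core" can still beat "more cores" in practice.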

On March 2nd, three versions of the Ryzen 7 were released to the public: the 1700, 1700X, and the 1800X. Specifications and prices can be found in the image below:


Compared to Intel's top-of-the-line Core i7-6900K with similar specifications (eight cores and a base clock speed of 3.2 GHz) priced at $1,099, it's hard not to go with AMD. Even the base model, which costs about a third as much, has the same number of cores, albeit with a lower base clock speed. Tests have also shown that Ryzen can sometimes match or even outperform the 6900K in many tasks, such as gaming and content creation.

These are extremely high-end CPUs targeted at consumers looking for the best of the best. However, if the rumors are true, AMD is planning to release lower-end parts for the more average consumer, to compete with Intel's Core i3 and i5.

Want to learn more about Ryzen? Watch a fully detailed video here!
Links to benchmarks and other shenanigans:
1. UserBenchmark
2. Builds with the CPU
3. Pricing and Availability 




What is DirectX?

By Qingyang Li

Back when computers ran Windows XP, anyone who got a new computer had to install some basic software themselves. In particular, when running a computer game for the first time, an error would sometimes appear saying you had to install a piece of software named DirectX. This error became uncommon after Windows 7, because almost every computer now comes with this software installed automatically. So why is this software so important that we can't run games on a PC without it?

DirectX, first released in 1995 and officially named Microsoft DirectX, is a collection of application programming interfaces (APIs) for handling tasks related to multimedia, especially game programming and video, on Microsoft platforms. Originally, the names of these APIs all began with Direct, such as Direct3D, DirectDraw, DirectMusic, DirectPlay, DirectSound, and so forth. The name DirectX was coined as a shorthand term for all of these APIs (the X standing in for the particular API names) and soon became the name of the collection. When Microsoft later set out to develop a gaming console, the X was used as the basis of the name Xbox to indicate that the console was based on DirectX technology. The X initial has been carried forward in the naming of APIs designed for the Xbox, such as XInput and the Cross-platform Audio Creation Tool (XACT), while the DirectX pattern has been continued for Windows APIs such as Direct2D and DirectWrite.

Direct3D (the 3D graphics API within DirectX) is widely used in the development of video games for Microsoft Windows and the Xbox line of consoles. Direct3D is also used by other software applications for visualization and graphics tasks such as CAD/CAM engineering. As Direct3D is the most widely publicized component of DirectX, it is common to see the names "DirectX" and "Direct3D" used interchangeably.

In summary, DirectX allows software to write instructions directly to audio and video hardware, improving multimedia performance. Games that use DirectX can take advantage of multimedia and graphics accelerator features more efficiently.

The DirectX software development kit (SDK) is available as a free download. While the runtimes are proprietary, closed-source software, source code is provided for most of the SDK samples. Starting with the release of the Windows 8 Developer Preview, the DirectX SDK has been integrated into the Windows SDK. This is why, after Windows 7, we no longer need to install DirectX ourselves.



How Video Game Companies Have Changed Their Marketing

By Tanner Fred

In the early days of video games, when they were nowhere near as popular as they are today, most companies made their money as you would expect: through game sales. But as the companies have developed, so have their marketing strategies. Now most of the money that companies make comes through microtransactions and DLC.

Now you're probably wondering what those words mean. In games, there are instances where you can make purchases that improve the gameplay experience. For example, in Call of Duty you can buy crates that unlock different weapon skins or character customization options that won't necessarily make you better at the game, but could make it more fun. As these visual aspects of a character have become a more important part of the gaming community, companies have realized that they could use them as a way to make additional profit. These purchases are what's considered microtransactions.




Now, DLC is a bit different. DLC is an acronym that stands for downloadable content. It means that a company releases additional content for the game that either extends the game's story or adds more multiplayer options. For example, again using Call of Duty, they will release the base (called "vanilla") version of the game and then sell four additional multiplayer maps as DLC for $20. They will normally do this four times a year, or give you the option to buy a season pass that gets you all of the DLC for $50. Companies have made an absurd amount of money off of this strategy and, at this rate, will continue to do so.
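Running the Call of Duty numbers from that example shows why the season pass is pitched as the better deal:

```python
# Pricing from the Call of Duty example above (illustrative figures).
MAP_PACK_PRICE = 20     # dollars per DLC map pack
PACKS_PER_YEAR = 4      # packs typically released per year
SEASON_PASS_PRICE = 50  # one-time price covering all packs

individual_total = MAP_PACK_PRICE * PACKS_PER_YEAR
savings = individual_total - SEASON_PASS_PRICE

print(f"Individually: ${individual_total}")   # $80
print(f"Season pass:  ${SEASON_PASS_PRICE}")  # $50
print(f"Savings:      ${savings}")            # $30
```

Of course, the pass only pays off if you actually want every pack, which is exactly the bet the publisher is making.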


Prompt 4: Considering Delivery and Style

By Tanner Fred

When reading through the blog of University of Texas student Eric Lee, there are a lot of things I learned by comparing it to a scholarly article or journal. Even if a blog is delivered in a serious manner, it is very difficult for it to match the professional polish used by experts. When professionals deliver their findings or information, it is mostly in an article of some sort, an essay, or a journal.

The reason that the article, written by professionals in the field, is more reliable and trusted is that these people have experience in the field and more extensive knowledge of it. Although the writer of the blog, in this case Mr. Lee, may be quite knowledgeable on the subject, it is not comparable to somebody with years of professional experience. The audiences of the two posts are also quite different due to the method of delivery. The blog post by the student is directed more at other college students, as well as high school students considering entering the field. Meanwhile, an article written by a professional is most likely meant to publish their work and is therefore written for other professionals in the field. Reading the blog, we learn how the field is viewed from the perspective of somebody looking in from the outside, while the article gives us a much more professionally written perspective from somebody inside the field.

Click here to see Eric Lee's blog.
Click here to see the professionally written article.

Sunday, March 5, 2017

Brain-Computer Interface and Mind Controlled Prosthetics

By Xavier Reinders

Jaimie Henderson and Krishna Shenoy of the Stanford research team
The keyboard may be a thing of the past, according to a Stanford research effort. Well, not really, but your brain could be the keyboard of the future. The investigation, detailed here, demonstrated a brain-to-computer interface allowing the user to type with their thoughts.

So who cares? Firstly, you do, because that is awesome, and it is the future people have been talking about in sci-fi novels, movies, and comics since the introduction of computers. Secondly, the entire investigation was aimed at helping those who are paralyzed to better interact with technology. Due to their paralysis, it was previously impossible for these people to interact with technology and very difficult for them to interact with people. With this advancement, it becomes easier for them to engage with the outside world through computers. The investigation specifically looked at typing with your mind, a much more complex task than the mind-controlled prosthetics discussed later. Instead of giving a basic command like "move", "open", or "close", the researchers were able to give a command like "type s in this field." That is comparable to attempting to bend just one knuckle of a finger. They had to pick out one specific brain pulse from the millions that are occurring and relay just that one to the computer.

On a less medicinal note, this is an amazing breakthrough for computers as a whole. Creating an interface where our brains can instruct a machine is the pathway to advanced cybernetics, awesome video experiences, and incredibly easy operation of machinery. Instead of sending a signal to your fingers and arms to move and hit keys, your brain could send a signal to a computer that you want that letter put wherever, skipping all the extras. We could develop exo-suits allowing paralyzed people to walk again or create robots controlled by our minds to do work for us. This could even be used to create entirely new body parts for people who do not have them, functioning as they would if they were flesh and muscle instead of fiber and metal.

In fact, such a step has already been taken into practice with the existence of mind-controlled prosthetics. Before these brain-computer links were developed, the revolution in prosthetics was flex sensing: moving a prosthetic based on measurements of the remaining muscles that would have caused that movement. For example, if you lost the lower half of your arm, sensors would detect muscle movement in your upper arm and translate it to the prosthetic as if it were actually connected to those muscles. Now, as computers and our understanding of the human brain improve, prosthetics can be created that link to your nerves and brain pathways rather than your muscles. So, as I said above, a user might think "move" and the prosthetic would respond; this is an important step for people who lost a whole limb and don't have the remaining muscles to operate a muscle-sensing prosthetic. In fact, these devices are at a point where thoughts can be used to perform the precise movements needed to operate a computer mouse and keyboard.

In the less developed field of feedback, researchers and engineers alike are working to transfer information from sensors back into the brain in a way that the brain can interpret as touch. Combined with mind-controlled prosthetics, this could lead to full robotic recovery of lost limbs: hands that not only grasp and hold but feel.

What Language Started It All?

John Backus
Behind any command of a programming language, there is code underneath. I was working on a project for my programming class one night when the question suddenly hit me: who created the first programming language, and what was it like compared to modern-day code?

FORTRAN was the world's first high-level programming language, developed at the International Business Machines Corporation (IBM) by a team led by John Backus, an American computer scientist. The language was released in 1957 and, surprisingly, is still used today for large-scale numerical calculations in science and engineering.

The initial release of FORTRAN contained 32 statements; examples include:
- if
- go to
- do
- write
- print
- continue
The six example statements listed above are still used in modern-day languages such as Python, Java, and C++, which just shows how advanced FORTRAN was for its time.
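Those old keywords map almost directly onto modern code. As a small sketch, here are three of them at work in Python (the unstructured `go to` is the one that modern languages deliberately dropped):

```python
# FORTRAN-era statements surviving in Python: if, continue, and print
# are still keywords/builtins, and the for loop plays the role of DO.
total = 0
for i in range(1, 11):
    if i % 2 == 0:
        continue          # skip even numbers
    total += i            # sum the odd numbers 1, 3, 5, 7, 9
print(total)  # -> 25
```

Sixty years on, a FORTRAN programmer could read this loop almost word for word.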

Before the development of disk files, text editors, and terminals, programs were often entered on punch cards using a keypunch keyboard. A punch card with code would look something like this:


One punch card
Looking at the card closely, we can see rows and eighty columns of numbers. A keypunch would punch holes in specific places based on the code needed. Compared to a terminal or text editor, the card looks extremely confusing, so in order to enhance readability the columns were divided into four sections:
- Columns 1 to 5 held the statement label, used to refer to lines from statements such as the ones listed above
- Column 6 was the continuation field. If a character other than a blank or zero was punched there, the card was treated as a continuation of the previous card. (Multiple cards could be used for one statement.)
- Columns 7 to 72 held the statement itself, the body of your code in today's terms
- Columns 73 to 80 were ignored since, at the time, IBM's compiler read only 72 columns. Because these columns were ignored, programmers could use them to number the cards and keep large decks organized.
When the punch cards were finally finished, they would be fed into a card reader for the computer to compile and run.
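That fixed-column layout is simple enough to sketch in code. This hypothetical parser (the function and field names are mine, for illustration) splits one 80-character card line into the four fields described above:

```python
def parse_card(line):
    """Split one 80-column FORTRAN punch card line into its fields.

    Columns (1-indexed, as described above):
      1-5   statement label
      6     continuation marker (non-blank, non-zero continues a statement)
      7-72  the statement itself
      73-80 sequence field, ignored by the compiler
    """
    line = line.ljust(80)  # pad short lines out to a full card
    return {
        "label": line[0:5].strip(),
        "continuation": line[5] not in (" ", "0"),
        "statement": line[6:72].rstrip(),
        "sequence": line[72:80].strip(),
    }

# A sample card: label 10, a DO loop, and a sequence number on the end.
card = "   10 " + "DO 20 I = 1, 10".ljust(66) + "CARD0001"
fields = parse_card(card)
print(fields["label"])      # -> 10
print(fields["statement"])  # -> DO 20 I = 1, 10
```

Fixed-column parsing like this is exactly why a card punched one column off would fail to compile: the machine read positions, not tokens.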

Looking back, it seems the main differences between FORTRAN and modern-day languages were the speed at which code could be read and the range of functionality. Compared to a text editor, where code can be compiled in a near instant, the punch card method could take hours if huge calculations needed to be done. The cards then had to be fed individually into the reader, which could also take some time. FORTRAN was also mainly used for calculations, whereas programming today spans many other fields, such as video game design and web development. Obviously, many changes have been made to its commands, and new ones added, but it shows how far computer science has come while still incorporating its original DNA.


Fun facts about computers

By Qingyang Li

Nowadays, computers have become some of the most important tools in our daily lives. They have completely changed our lives in so many ways. Although computers have been developed over decades, there are still some fun facts you may not know, and here I am going to introduce some of them to you.

1. The first electronic computer, ENIAC, weighed more than 27 tons and took up 1,800 square feet.

2. Only about 10% of the world's currency is physical money; the rest exists only on computers.

3. "Typewriter" is the longest word that you can type using letters from only one row of your computer keyboard.

4. Doug Engelbart invented the first computer mouse around 1964; it was made of wood.

5. More than 5,000 new computer viruses are released every month.

6. Around 50% of all Wikipedia vandalism is caught by a single computer program with more than 90% accuracy.

7. If there was a computer as powerful as the human brain, it would be able to do 38 thousand trillion operations per second and hold more than 3580 terabytes of memory.

8. The password for the computer controls of the US's nuclear-tipped missiles was 00000000 for eight years.

9. Approximately 70% of virus writers are said to work under contract for organized crime syndicates.

10. HP, Microsoft, and Apple have one very interesting thing in common: they were all started in a garage.

11. An average person normally blinks 20 times a minute, but when using a computer he or she blinks only 7 times a minute.

12. The house where Bill Gates lives was designed using a Macintosh computer.

13. The first ever hard disk drive, made in 1956, could hold only 5 MB of data.

14. The first 1 GB hard disk drive was announced in 1980; it weighed about 550 pounds and cost $40,000.

15. More than 80% of the emails sent daily are spam.

16. A group of 12 engineers designed the IBM PC, and they were called "The Dirty Dozen."

17. The original name of Windows was Interface Manager.

18. The first microprocessor created by Intel was the 4004. It was designed for a calculator, and at that time nobody imagined where it would lead.

19. The IBM 5120, from 1980, was the heaviest desktop computer ever made. It weighed about 105 pounds, not including the 130-pound external floppy drive.

20. The Genesis Device demonstration sequence in Star Trek II: The Wrath of Khan was the first entirely computer-generated movie sequence in the history of cinema. The studio that created it later became Pixar.