Wednesday 1 September 2010

Intro to Programming

The programming course requires me to learn Java, so I will stop here and continue with PHP and MySQL first; when I have free time and have finished with PHP and MySQL, I will return to Java and start programming.

Intro to Programming 1

The 'Introduction to Computer Science' Wikiversity course is obviously very underdeveloped, so I went on to 'Introduction to Programming', hoping it won't suffer from the same issues...

----

A program is a set of instructions that tell a computer how to do a task. When a computer follows the instructions in a program, we say it executes the program. You can think of it like a recipe that tells you how to make a peanut butter sandwich. In this model, you are the computer, making a sandwich is the task, and the recipe is the program that tells you how to execute the task.

  • Activity: Come up with a set of instructions to tell someone how to make a peanut butter sandwich. Don't leave any steps out, or put them in the wrong order.

Was that easy? Did you remember all the steps? Maybe you forgot to tell me to use a knife to spread the peanut butter. Now I've got peanut butter all over my hands! Of course, you say, a person wouldn't be that dumb. But a computer is that dumb. A computer will only do what you tell it to do. This might make programming frustrating at first, but it's relieving in a way: if you do everything right, you know exactly what the computer is going to do, because you told it.

Of course, computers don't understand recipes written on paper. Computers are machines, and at the most basic level, they are a collection of switches—where 1 represents "on" and 0 represents "off". Everything that a computer does is implemented in this most basic of all numbering systems—binary. If you really wanted to tell a computer what to do directly, you'd have to talk to it in binary, giving it coded sets of 1s and 0s that tell it which instructions to execute. However, this is nearly impossible. In practice, we use a programming language.
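
The course doesn't show any actual code at this point, but just for myself, here is a minimal sketch of what 'a set of instructions' looks like in a programming language (Java, since that's what the programming course uses). It simply prints the sandwich steps in order; the class name and the step wording are my own.

    // A minimal sketch, not from the course: a Java program is an ordered
    // list of instructions that the computer executes from top to bottom.
    public class SandwichRecipe {
        public static void main(String[] args) {
            System.out.println("Take two slices of bread.");
            System.out.println("Open the peanut butter jar.");
            System.out.println("Use a knife to spread peanut butter on one slice.");
            System.out.println("Put the second slice on top.");
        }
    }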

Intro to Computer Science 4

Actually, the article on algorithms reminded me that I'm a huge maths dummy and that the Greek state gave me my high school certificate only because I was cute back then :/

So I purchased a book on basic maths (from the 'For Dummies' series) and will study it, but I will not stop the 'Intro to Computer Science', since I only do it for fun and for a general rounded education alongside my formal study of XHTML and CSS at my university.

(The panic is now over... maths always, ALWAYS, gives me the most horrible panic attacks, as if someone had just entered my flat and shouted 'exam today, kids!')

Intro to Computer Science 3

What is an 'algorithm'? A set of instructions to solve a problem STEP BY STEP.

An example of an algorithm to solve the problem of frying eggs for one person (a rough code sketch follows the list):

1. Take two eggs out of the fridge.
2. Heat the oil in the pan so that the temperature rises enough, but not too much.
3. Crack the eggs out of their shells and into the pan.
4. Let them fry until they are robust but not burnt.
5. When robust, remove them from the pan and place them on a plate.
6. Turn off the stove.
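
The six steps above can also be written as a straight line of code, one instruction after another. Here is my own rough sketch in Java; every method name (takeEggsFromFridge and so on) is invented for this example and not from any real library.

    // Rough sketch only: the egg-frying steps as straight-line Java.
    public class FryEggs {
        public static void main(String[] args) {
            takeEggsFromFridge(2);
            heatOilInPan();
            crackEggsIntoPan();
            fryUntilRobust();
            moveEggsToPlate();
            turnOffStove();
        }

        // Placeholder implementations that just report each step.
        static void takeEggsFromFridge(int count) { System.out.println("Taking " + count + " eggs out of the fridge."); }
        static void heatOilInPan()     { System.out.println("Heating the oil in the pan."); }
        static void crackEggsIntoPan() { System.out.println("Cracking the eggs into the pan."); }
        static void fryUntilRobust()   { System.out.println("Frying until robust but not burnt."); }
        static void moveEggsToPlate()  { System.out.println("Moving the eggs to a plate."); }
        static void turnOffStove()     { System.out.println("Turning off the stove."); }
    }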


A more complex and correct algorithm is one that anticipates errors and omissions. The following is from the Wikipedia article on algorithms (a code sketch of the decision points follows the list):

1. Get the frying pan.
2. Get the oil.
   a. Do you have oil?
      1) If yes, put it in the pan.
      2) If no, do you want to buy oil?
         a) If yes, then go out and buy it.
         b) If no, you can terminate.
3. Turn on the stove.
   (etc.)
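
And here is my own rough sketch (not from Wikipedia) of how those decision points look in Java: each 'if yes / if no' question becomes an if/else branch, and 'terminate' becomes a return. The class name and the printed messages are just made up for illustration.

    import java.util.Scanner;

    // Sketch of the algorithm with decisions (error/omission handling).
    public class FryEggsWithChecks {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);

            System.out.println("Get the frying pan.");

            System.out.print("Do you have oil? (yes/no) ");
            boolean haveOil = in.nextLine().trim().equalsIgnoreCase("yes");

            if (!haveOil) {
                System.out.print("Do you want to buy oil? (yes/no) ");
                if (in.nextLine().trim().equalsIgnoreCase("yes")) {
                    System.out.println("Go out and buy oil.");
                } else {
                    System.out.println("No oil, so we terminate.");
                    return; // the algorithm stops here
                }
            }

            System.out.println("Put the oil in the pan.");
            System.out.println("Turn on the stove.");
            // (etc.)
        }
    }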


Intro to Computer Science 2

A summary of the Wikipedia article 'Computer':

A computer is a machine. It receives data that it can then manipulate and output in ways that are meaningful to humans. The machine does not itself understand the meaning (the semantic value) of the data it stores and manipulates.
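
As a small illustration of that receive/manipulate/output idea (my own sketch, not from the article): a tiny Java program can take in a number, manipulate it, and output something meaningful to a human.

    import java.util.Scanner;

    // Sketch of the input -> manipulation -> output idea.
    public class DoubleIt {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            System.out.print("Give me a number: ");                    // receive data
            int number = in.nextInt();
            int doubled = number * 2;                                  // manipulate it
            System.out.println(number + " doubled is " + doubled);     // output it meaningfully
        }
    }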

The first electronic computers were made in the mid-20th century in the USA.

Computers now exist in most devices, like watches, games, etc. The first computers filled an entire room; they have since evolved into a small portable device, the 'personal computer', AKA the PC.

A computer is not necessarily an electronic device. It can be any instrument that re-interprets inserted information. Thus computers have existed since ancient times in various forms, and it was from these first utensils that numbers were abstracted: the discipline of mathematics was created out of trying to measure the physical world, that is, with basic computers on wooden and bone sticks that recorded data for re-interpretation and for communicating those data as accurately as possible. Once numbers were abstracted from the physical devices, it became much easier to manipulate them, although the data produced from these calculations were not always interpreted back into the physical world, to make architecture for example.

Personal computers (a computer for one person) began in the 1970s, and by 1975 they were much smaller because of the invention of the microprocessor; hence they also dropped in price and became more affordable (the first personal computers were extremely expensive to purchase).

Until the end of the '80s there was a race for smaller and easier-to-use PERSONAL computing, and in the '90s the PC market took off and computers became 'mainstream', not something only a few people owned and understood.

Today computers are embedded even in other personal devices, like personal music players, watches, mobile phones, etc.

Tuesday 31 August 2010

Introduction to Computer Science 1

Summaries of the articles required reading in part 1 (of 12 parts) of the course 'Introduction to Computer Science':

Summary of the article 'History of Computation'

The first modern-day concept of the computer was developed at Harvard University (a prominent university in the USA) in the '40s (article on this).

Nevertheless, the history of computing itself started very early in human history, as early as prehistory (tally sticks as a device for computing). Tally sticks were still used for rural commerce agreements in Europe(!) up to the end of the 19th century.

The story follows how archaeologists have unearthed devices for computing, from prehistory to the first digital (no longer analog) computers in the 1940s and '50s.

Summary of article 'The History of Computing Hardware'

In this article, even our fingers, in combination with a method of counting, are referred to as a prototype of computing hardware.

(To be continued after some clarifications with the other students/tutors and after familiarizing myself more with the general workings of the wiki website technology; so far I only know how to push a button to add my signature name and date.)