Tuesday, June 4, 2024

How Does The Brain Interpret Computer Languages? 


In the US, a 2016 Gallup poll found that a majority of schools want to start teaching coding, with 66 percent of K-12 school principals saying that computer science learning should be incorporated into other subjects. Most countries in Europe have added coding classes and computer science to their school curricula, with France and Spain introducing theirs in 2015.

This new generation of coders is expected to boost the worldwide developer population from 23.9 million in 2019 to 28.7 million in 2024. Despite all this effort, there’s still some confusion about how to teach coding. Is it more like a language, or more like math? Some new research may have settled this question by watching the brain’s activity while subjects read Python code.

Right now, there are two schools of thought. The prevailing one is that coding is a type of language, with its own grammar rules and syntax that must be followed. After all, they’re called coding languages for a reason, right? This idea even has its own snazzy acronym: Coding as Another Language, or CAL…

Source: How does the brain interpret computer languages? | Ars Technica

Critics:

Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission, and data storage.

Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods.

This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data. There are four types of coding:

  1. Data compression (or source coding)
  2. Error control (or channel coding)
  3. Cryptographic coding
  4. Line coding

Data compression attempts to remove unwanted redundancy from the data from a source in order to transmit it more efficiently. For example, ZIP data compression makes data files smaller, for purposes such as reducing Internet traffic. Data compression and error correction may be studied in combination.
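
As a rough illustration of redundancy removal, here is a minimal sketch in Python using the standard-library zlib module (which implements the DEFLATE algorithm behind ZIP): a highly repetitive input shrinks dramatically, while random bytes, which carry almost no redundancy, barely compress at all.

```python
import os
import zlib

# Highly redundant input: the same short phrase repeated many times.
redundant = b"coding theory 101 " * 64

# High-entropy input of the same size: random bytes leave the
# compressor almost no redundancy to remove.
random_like = os.urandom(len(redundant))

for label, data in [("redundant", redundant), ("random", random_like)]:
    compressed = zlib.compress(data, level=9)
    print(f"{label}: {len(data)} bytes -> {len(compressed)} bytes")

# Compression here is lossless: decompressing restores the input exactly.
assert zlib.decompress(zlib.compress(redundant)) == redundant
```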

Error correction adds useful redundancy to the data from a source to make the transmission more robust to disturbances present on the transmission channel. The ordinary user may not be aware of many applications using error correction. A typical music compact disc (CD) uses the Reed–Solomon code to correct for scratches and dust. In this application the transmission channel is the CD itself.
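
Reed–Solomon decoding is too involved to sketch here, but the core idea of adding redundancy so the receiver can undo channel errors shows up even in the simplest channel code: a three-fold repetition code with majority-vote decoding. The Python sketch below is a minimal illustration of that idea, not the CD’s actual Reed–Solomon scheme.

```python
def encode(bits):
    """Repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] ^= 1                     # the channel flips one bit in transit
assert decode(sent) == message   # one error per group is corrected
```

Real channel codes such as Reed–Solomon achieve the same effect with far less overhead than tripling every bit.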

Cell phones also use coding techniques to correct for the fading and noise of high-frequency radio transmission. Data modems, telephone transmissions, and the NASA Deep Space Network all employ channel coding techniques to get the bits through, for example turbo codes and LDPC codes.

In 1948, Claude Shannon published “A Mathematical Theory of Communication”, an article in two parts in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit. In this fundamental work he used tools in probability theory, developed by Norbert Wiener, which were in their nascent stages of being applied to communication theory at that time.

Shannon developed information entropy as a measure for the uncertainty in a message while essentially inventing the field of information theory. The binary Golay code was developed in 1949. It is an error-correcting code capable of correcting up to three errors in each 24-bit word, and detecting a fourth.
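
Information entropy is easy to state concretely: for a source emitting symbols with probabilities p₁, …, pₙ, the entropy is H = −Σ pᵢ log₂ pᵢ, measured in bits per symbol. A quick Python sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is more predictable
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely symbols
```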

Richard Hamming won the Turing Award in 1968 for his work at Bell Labs in numerical methods, automatic coding systems, and error-detecting and error-correcting codes. He invented the concepts known as Hamming codes, Hamming windows, Hamming numbers, and Hamming distance.
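
Two of those concepts can be shown in a few lines: the Hamming distance counts the positions at which two words differ, and the classic Hamming(7,4) code protects four data bits with three parity bits so that any single-bit error can be located and flipped back. A minimal Python sketch of the textbook construction:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword.

    Layout (1-indexed): p1 p2 d1 p4 d2 d3 d4, parity bits at 1, 2, 4.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # checks positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # checks positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # checks positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parity checks; the syndrome is the error position."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4    # 0 means no single-bit error detected
    if pos:
        c[pos - 1] ^= 1           # flip the offending bit back
    return c

word = hamming74_encode([1, 0, 1, 1])
corrupted = list(word)
corrupted[5] ^= 1                 # one bit flipped in transit
assert hamming74_correct(corrupted) == word

print(hamming_distance("karolin", "kathrin"))  # 3
```

The syndrome trick works because each parity bit covers a distinct subset of positions, so the pattern of failed checks spells out the error position in binary.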

In 1972, Nasir Ahmed proposed the discrete cosine transform (DCT), which he developed with T. Natarajan and K. R. Rao in 1973. The DCT is the most widely used lossy compression algorithm and the basis for multimedia formats such as JPEG, MPEG, and MP3.
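
The DCT earns its place in lossy codecs by packing most of a smooth signal’s energy into a few low-frequency coefficients, which can then be quantized coarsely or discarded. The sketch below implements the common DCT-II variant directly from its defining formula in NumPy; this is for illustration only, not the optimized transform real codecs use.

```python
import numpy as np

def dct2(x):
    """Unnormalized DCT-II: X[k] = sum_n x[n] * cos(pi/N * (n + 0.5) * k)."""
    x = np.asarray(x, dtype=float)
    n = np.arange(len(x))
    return np.array([np.sum(x * np.cos(np.pi / len(x) * (n + 0.5) * k))
                     for k in n])

# A smooth signal: half a cosine cycle sampled at 8 points.
signal = np.cos(2 * np.pi * np.arange(8) / 16)
print(np.round(dct2(signal), 3))
# Energy concentrates in the first few (low-frequency) coefficients,
# which is exactly what makes coarse quantization cheap in JPEG and MP3.
```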

Cryptography, or cryptographic coding, is the practice and study of techniques for secure communication in the presence of third parties (called adversaries). More generally, it is about constructing and analyzing protocols that block adversaries; various aspects of information security such as data confidentiality, data integrity, authentication, and non-repudiation are central to modern cryptography. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, and electrical engineering.

Applications of cryptography include ATM cards, computer passwords, and electronic commerce. Cryptography prior to the modern age was effectively synonymous with encryption, the conversion of information from a readable state to apparent nonsense.

The originator of an encrypted message shared the decoding technique needed to recover the original information only with intended recipients, thereby precluding unwanted persons from doing the same. Since World War I and the advent of the computer, the methods used to carry out cryptology have become increasingly complex and its application more widespread.

Modern cryptography is heavily based on mathematical theory and computer science practice; cryptographic algorithms are designed around computational hardness assumptions, making such algorithms hard to break in practice by any adversary. It is theoretically possible to break such a system, but it is infeasible to do so by any known practical means.

These schemes are therefore termed computationally secure; theoretical advances, e.g., improvements in integer factorization algorithms, and faster computing technology require these solutions to be continually adapted. There exist information-theoretically secure schemes that provably cannot be broken even with unlimited computing power—an example is the one-time pad—but these schemes are more difficult to implement than the best theoretically breakable but computationally secure mechanisms.
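
The one-time pad is simple enough to sketch directly: XOR the message with a truly random key of the same length and never reuse the key. Without the key, every plaintext of that length is equally consistent with the ciphertext, which is why the scheme is information-theoretically secure. A minimal Python sketch:

```python
import os

def xor_bytes(data, key):
    """One-time pad: XOR each message byte with the matching key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = os.urandom(len(message))    # truly random, same length, used once

ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message   # XOR is its own inverse

# Without the real key, any guessed key yields some equally plausible
# "plaintext" -- the ciphertext alone reveals nothing but the length.
print(xor_bytes(ciphertext, os.urandom(len(message))))
```

The implementation burden noted above is visible even here: the key must be as long as all the traffic it will ever protect and must be delivered securely in advance.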

Another concern of coding theory is designing codes that help synchronization. A code may be designed so that a phase shift can be easily detected and corrected and so that multiple signals can be sent on the same channel. Another application of codes, used in some mobile phone systems, is code-division multiple access (CDMA). Each phone is assigned a code sequence that is approximately uncorrelated with the codes of other phones.
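
That “approximately uncorrelated” property can be checked numerically. In the toy NumPy sketch below (random ±1 codes rather than the structured spreading sequences real CDMA systems use), correlating the shared channel against a phone’s own code recovers that phone’s bit, while the cross-correlation between different codes stays near zero:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024  # chips (code samples) per transmitted bit

# Each phone gets its own pseudo-random +/-1 spreading sequence.
code_a = rng.choice([-1, 1], size=N)
code_b = rng.choice([-1, 1], size=N)

# Phone A sends bit +1 and phone B sends bit -1 at the same time;
# the channel simply adds the two spread signals together.
channel = (+1) * code_a + (-1) * code_b

# Correlating with a phone's own code recovers that phone's bit.
print(np.dot(channel, code_a) / N)   # close to +1 (phone A's bit)
print(np.dot(channel, code_b) / N)   # close to -1 (phone B's bit)
print(np.dot(code_a, code_b) / N)    # cross-correlation, near 0
```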
