Encoding Scheme Definition and Types

What is Encoding?

Encoding is the process of converting data from one form into another. In computer science, encoding means converting data (numbers, letters, symbols, spaces, graphics, etc.) into binary codes that a computer can store and process.
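
As a simple illustration, the following Python sketch (the choice of Python and the sample string are ours, not part of the original text) converts a short piece of text into bytes and then prints the binary code of each byte:

```python
# Convert text into bytes using a standard encoding, then show the bits.
text = "Hi!"
data = text.encode("ascii")          # b'Hi!' -> the underlying byte values

for byte in data:
    # format(byte, '08b') prints each byte as 8 binary digits
    print(byte, format(byte, "08b"))
# 72 01001000
# 105 01101001
# 33 00100001
```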

Why Encoding?

  • Platform Independency
  • Internationalization
  • Security
  • Effective Communication

Encoding Schemes

There are various standard encoding schemes, and in each of them every piece of data is assigned a unique code. Some of the popular encoding schemes are mentioned below:

ASCII (American Standard Code for Information Interchange)

  • Work on the standard formally began in 1960, and it was later maintained by ANSI (the American National Standards Institute).
  • It is the standard, common way to encode and represent the keys of a keyboard so that they are understood by every computer.
  • It uses 7 bits to represent each character.
  • It can therefore represent (encode) a total of 2⁷ = 128 characters.
  • It can represent (encode) the character set of the English language only (see the code sketch after the table below).

ASCII Table

Character   Decimal Value   Character   Decimal Value   Character   Decimal Value
Space       32              @           64              `           96
!           33              A           65              a           97
"           34              B           66              b           98
#           35              C           67              c           99
$           36              D           68              d           100
%           37              E           69              e           101
&           38              F           70              f           102
'           39              G           71              g           103
(           40              H           72              h           104
)           41              I           73              i           105
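
As a quick check of the table, here is a small Python sketch (ours, not part of the original article) that maps characters to their ASCII decimal values and back using the built-in ord() and chr() functions:

```python
# ord() gives the code point (decimal value) of a character,
# chr() converts a code point back to the character.
for ch in ["A", "a", "@", "!"]:
    print(ch, ord(ch))       # A 65, a 97, @ 64, ! 33

print(chr(72), chr(105))     # H i

# ASCII is a 7-bit code, so every value fits in the range 0..127.
print(all(ord(c) < 2 ** 7 for c in "Hello, ASCII!"))   # True
```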

ISCII (Indian Standard Code for Information Interchange)

  • Introduced by the Bureau of Indian Standards (BIS) in 1991.
  • It is the standard encoding scheme for representing Indian scripts.
  • It uses 8 bits to represent each character.
  • It can therefore represent a total of 2⁸ = 256 characters (see the sketch after this list).
  • It supports 10 Indian scripts (Devanagari, Punjabi/Gurmukhi, Bengali, Gujarati, Oriya, Telugu, Assamese, Kannada, Malayalam and Tamil) in addition to the Roman script.
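
The Python sketch below is our own and only illustrates what an 8-bit, single-byte scheme such as ISCII implies about the size of the code space; Python has no built-in ISCII codec, so no real ISCII byte assignments are shown:

```python
# An 8-bit scheme has room for 2**8 distinct codes.
print(2 ** 8)                          # 256

# In any single-byte encoding, one character occupies exactly one byte,
# so the encoded length equals the number of characters.
# ('latin-1' is used here only as a stand-in single-byte encoding.)
text = "ABC"
print(len(text.encode("latin-1")))     # 3 bytes for 3 characters
```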

UNICODE

  • It has been developed to represent the characters of every written language of the world.
  • It can represent over one million characters (1,114,112 code points in all).
  • Depending on the encoding form used, characters are stored using 8-bit, 16-bit or 32-bit code units.
  • It is a superset of ASCII: the first 128 Unicode code points are identical to ASCII.
  • UTF-8, UTF-16 and UTF-32 are the common Unicode encodings, of which UTF-8 is the most widely used (see the sketch below).
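
To see how the same text occupies a different number of bytes under each encoding form, here is a short Python sketch (the sample string is our own choice):

```python
# Encode a string containing an ASCII letter and a non-ASCII character (₹)
# in the three common Unicode encoding forms and compare the results.
text = "A₹"

print(text.encode("utf-8"))    # b'A\xe2\x82\xb9'  -> 1 byte + 3 bytes
print(text.encode("utf-16"))   # BOM (2 bytes) + 2 bytes + 2 bytes
print(text.encode("utf-32"))   # BOM (4 bytes) + 4 bytes + 4 bytes

# UTF-8 is backward compatible with ASCII: pure ASCII text encodes
# to exactly the same bytes as it would under ASCII.
print("Hello".encode("utf-8") == "Hello".encode("ascii"))   # True
```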