Megabyte
The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1,000,000 (10^6) in the International System of Units (SI). Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.
In the computer and information technology fields, other definitions have been used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes (2^20 B), a quantity that conveniently expresses the binary architecture of digital computer memory. Standards bodies have deprecated this usage of the megabyte in favor of a set of binary prefixes, in which this quantity is designated by the unit mebibyte (MiB).
Definitions
The unit megabyte is commonly used for either 1000^2 (one million) bytes or 1024^2 bytes. The interpretation using base 1024 originated as technical jargon for byte multiples that needed to be expressed in powers of 2 but lacked a convenient name. As 1024 (2^10) approximates 1000 (10^3), roughly corresponding to the SI prefix kilo, it was a convenient term to denote the binary multiple. In 1999, the International Electrotechnical Commission (IEC) published standards for binary prefixes requiring the use of megabyte to denote 1000^2 bytes and mebibyte to denote 1024^2 bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU, ISO, and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings.
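The size of the gap between the two conventions can be checked directly; the following is an illustrative Python snippet, not part of any standard:

```python
# Compare the decimal megabyte (SI) with the binary mebibyte (IEC).
megabyte = 1000 ** 2   # 1,000,000 bytes (SI definition)
mebibyte = 1024 ** 2   # 1,048,576 bytes (IEC binary prefix)

difference = mebibyte - megabyte
percent = difference / megabyte * 100

print(f"1 MiB exceeds 1 MB by {difference} bytes ({percent:.1f}%)")
# 1 MiB exceeds 1 MB by 48576 bytes (4.9%)
```

The roughly 4.9% gap per prefix step compounds at higher prefixes (gibi vs. giga, tebi vs. tera), which is why the distinction matters more for large capacities.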
 Base 10
 1 MB = 1,000,000 bytes (= 1000^2 B = 10^6 B) is the definition following the rules of the International System of Units (SI) and the International Electrotechnical Commission (IEC). This definition is used in computer networking contexts and for most storage media, particularly hard drives, flash-based storage, and DVDs, and is also consistent with other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The file manager of Mac OS X 10.6 (Snow Leopard) is a notable example of this usage in software; since that release, file sizes are reported in decimal units.
In this convention, one thousand megabytes (1000 MB) is equal to one gigabyte (1 GB), where 1 GB is one billion bytes.
 Base 2
 1 MB = 1,048,576 bytes (= 1024^2 B = 2^20 B) is the definition used by Microsoft Windows in reference to computer memory, such as random-access memory (RAM). This definition is synonymous with the unambiguous binary unit mebibyte. In this convention, one thousand and twenty-four megabytes (1024 MB) is equal to one gigabyte (1 GB), where 1 GB is 1024^3 bytes (i.e., 1 GiB).
 Mixed
 1 MB = 1,024,000 bytes (= 1000 × 1024 B) is the definition used to describe the formatted capacity of the 1.44 MB 3.5-inch HD floppy disk, which actually has a capacity of 1,474,560 bytes.
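The mixed unit can be verified from the standard HD floppy geometry (2 sides, 80 tracks per side, 18 sectors per track, 512 bytes per sector); a short Python check:

```python
# Formatted capacity of a 3.5-inch HD floppy disk:
# 2 sides x 80 tracks x 18 sectors x 512 bytes per sector.
capacity = 2 * 80 * 18 * 512
print(capacity)                   # 1474560 bytes

# Expressed in the mixed "megabyte" of 1000 x 1024 bytes:
print(capacity / (1000 * 1024))   # 1.44
```

Note that the same capacity is about 1.47 MB in decimal units and about 1.41 MiB in binary units; only the mixed unit yields the familiar "1.44".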
Randomly addressable semiconductor memory doubles in size for each address line added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive, by contrast, is the product of the sector size, the number of sectors per track, the number of tracks per side, and the number of disk platters in the drive; a change in any one of these factors would not usually double the capacity.
Examples of use
Depending on compression methods and file format, a megabyte of data can roughly be:
 a 1megapixel bitmap image (e.g. ~1152 × 864) with 256 colors (8 bits/pixel color depth) stored without any compression.
 6 seconds of 44.1 kHz/16-bit uncompressed CD audio.
 1 minute of 128 kbit/s MP3 lossy compressed audio.
 a typical English book volume in plain text format (500 pages × 2000 characters per page).
The human genome consists of DNA representing 800 MB of data. The parts that differentiate one person from another can be compressed to 4 MB.