Megabytes to Bytes

Convert megabytes to bytes for RAM sizing, hardware specs, and data budgets. 1 MB = 1,000,000 bytes (decimal); enter any MB value for the exact byte count.


Tips & Notes

  • RAM is always sized in binary units: an "8 GB" module is really 8 GiB = 8,589,934,592 bytes, and "16 GB" = 16 GiB = 17,179,869,184 bytes. When programming with memory buffers, use binary multiples: 64 MiB = 67,108,864 bytes, not 64,000,000 bytes.
  • Software installation sizes: Windows 11 minimum 64 GB = 64,000,000,000 bytes (decimal). Office 365 install ≈ 4,000 MB = 4,000,000,000 bytes. Adobe Photoshop ≈ 2,600 MB = 2,600,000,000 bytes. These sizes are declared in decimal MB by vendors.
  • GPU VRAM is binary: an "8 GB" GPU has 8,589,934,592 bytes of video memory. A 4K texture at 32-bit color = 3840 × 2160 × 4 bytes = 33,177,600 bytes ≈ 31.6 MiB per texture. Gaming at 4K requires VRAM to hold multiple large textures simultaneously.
  • Database and data science: a CSV with 1 million rows, 10 columns of 8-byte integers = 10,000,000 × 8 = 80,000,000 bytes = 80 MB. Pandas DataFrame in memory uses approximately 2-8× the raw CSV size due to Python object overhead and indexing.
  • Mobile data plans: 1 GB = 1,000 MB = 1,000,000,000 bytes (decimal, as used by carriers). Streaming Netflix at 720p (≈1.5 Mbps) uses 1.5/8 = 0.1875 MB/s × 3,600 s = 675 MB/hour. A 5 GB monthly plan = 5,000 MB = 5,000,000,000 bytes of cellular data.
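The two conversion factors used throughout these tips can be captured in a small helper. This is an illustrative sketch; the constant and function names are ours, not from any particular library:

```python
MB = 1_000_000      # decimal megabyte (SI): storage vendors, carriers, app stores
MiB = 1_048_576     # binary mebibyte (2**20): RAM, OS memory managers, firmware

def mb_to_bytes(value: float, binary: bool = False) -> int:
    """Convert a megabyte value to an exact byte count."""
    return int(value * (MiB if binary else MB))

# A 64 MiB memory buffer vs. a vendor-stated 64 MB download:
print(mb_to_bytes(64, binary=True))   # 67108864
print(mb_to_bytes(64))                # 64000000
```

Passing `binary=True` whenever the consumer is the OS, firmware, or a memory allocator is the whole trick; everything else in this article is applications of that switch.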

Common Mistakes

  • Using 1,000,000 bytes/MB when the system uses binary — GPU drivers, OS memory managers, and firmware use binary MiB internally. A shader buffer of "128 MB" in a graphics API means 128 × 1,048,576 = 134,217,728 bytes, not 128,000,000 bytes.
  • Comparing vendor-stated MB sizes without checking the convention — storage manufacturers use decimal MB (1,000,000 bytes); OS tools use binary MiB (1,048,576 bytes). A "500 MB" hard drive partition shows as 476.8 MiB in Linux df output.
  • Calculating network transfer time with mixed units — converting "500 MB file over 10 Mbps connection": 500 MB = 500,000,000 bytes = 4,000,000,000 bits. Time = 4,000,000,000 / 10,000,000 = 400 seconds = 6.67 minutes. Using MiB (524,288,000 bytes = 4,194,304,000 bits) would give 419.4 seconds instead.
  • Sizing buffer allocations in exact decimal MB — malloc(100 * 1000000) allocates 100,000,000 bytes. But if the code expects 100 MiB for page-aligned processing, it should allocate 100 * 1048576 = 104,857,600 bytes. Using the wrong factor causes off-by-4,857,600 errors in memory-sensitive code.
  • Treating megabytes as a fixed unit in disk partitioning — partition sizes may be reported in decimal MB by some tools and binary MiB by others (fdisk, parted, GParted all differ). Always verify which unit the tool uses before calculating partition sizes for alignment.
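The mixed-units transfer-time mistake above can be checked numerically. A sketch with a hypothetical function name; note that network rates are always decimal (10 Mbps = 10,000,000 bits/s):

```python
def transfer_seconds(size_mb: float, link_mbps: float, binary: bool = False) -> float:
    """Seconds to move a file of size_mb megabytes over a link_mbps link."""
    bytes_total = size_mb * (1_048_576 if binary else 1_000_000)
    return bytes_total * 8 / (link_mbps * 1_000_000)

print(transfer_seconds(500, 10))                          # 400.0 s (decimal MB)
print(round(transfer_seconds(500, 10, binary=True), 1))   # 419.4 s (binary MiB)
```

The ~19-second spread on a single 500 MB file shows why download-time estimates drift when the file size and the link rate use different conventions.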

Megabytes to Bytes Overview

Megabytes to bytes conversion bridges the human-readable scale of file sizes and memory specifications to the exact byte counts required by programming, hardware configuration, and storage planning. The 4.86% gap between decimal MB and binary MiB grows into significant errors at scale.

Megabytes to bytes formula:

Bytes = MB × 1,000,000 (decimal) | Bytes = MiB × 1,048,576 (binary) | Gap: 1 MiB − 1 MB = 48,576 extra bytes (4.86%)
EX: RAM 8 GiB → 8,192 MiB × 1,048,576 = 8,589,934,592 bytes. A "500 MB" download → 500,000,000 bytes (decimal). The byte difference between the two for "8 GB": 8,589,934,592 (binary) vs. 8,000,000,000 (decimal) = 589,934,592 bytes discrepancy (≈ 590 MB).
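The "8 GB" discrepancy in the example can be reproduced in two lines; a minimal sketch:

```python
decimal_bytes = 8 * 1_000_000_000   # "8 GB" decimal: 8,000,000,000 B
binary_bytes = 8 * 1_073_741_824    # 8 GiB: 8,589,934,592 B
gap = binary_bytes - decimal_bytes
print(f"{gap:,}")                   # 589,934,592
```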
Decimal vs. binary — who uses which:
Decimal (1,000,000 B/MB): storage vendors, ISPs, mobile carriers, app stores | Binary (1,048,576 B/MiB): OS, RAM, firmware, programming
EX: iPhone "256 GB" storage = 256,000,000,000 bytes (decimal). iOS reports available space in decimal GB. Android may show GiB or GB depending on version. Always check which your analysis tools use before comparing figures.
RAM sizes in MiB and exact bytes:
RAM Label | Exact Bytes      | Decimal MB  | Common Use
512 MiB   | 536,870,912 B    | 536.9 MB    | Embedded systems
1 GiB     | 1,073,741,824 B  | 1,073.7 MB  | Low-end devices
4 GiB     | 4,294,967,296 B  | 4,294.9 MB  | 32-bit max address
8 GiB     | 8,589,934,592 B  | 8,589.9 MB  | Typical laptop
16 GiB    | 17,179,869,184 B | 17,179.9 MB | Workstation
64 GiB    | 68,719,476,736 B | 68,719.5 MB | High-end workstation
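The RAM byte counts above all come from one multiplication; a sketch that regenerates them:

```python
RAM_SIZES_MIB = [512, 1024, 4096, 8192, 16384, 65536]
for mib in RAM_SIZES_MIB:
    exact = mib * 1_048_576   # binary MiB -> bytes
    print(f"{mib:>6} MiB = {exact:>14,} B = {exact / 1_000_000:>10,.1f} decimal MB")
```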
Software and media sizes in MB and bytes:
Item                    | Stated Size | Actual Bytes     | Convention
Windows 11 ISO          | 5.1 GB      | 5,100,000,000 B  | Decimal
macOS Ventura           | 12 GB       | 12,000,000,000 B | Decimal
iOS iPhone update       | 1,500 MB    | 1,500,000,000 B  | Decimal
GPU L2 cache (RTX 4090) | 72 MiB      | 75,497,472 B     | Binary
L3 CPU cache (Ryzen 9)  | 64 MiB      | 67,108,864 B     | Binary
The megabytes-to-bytes conversion is a daily exercise in systems programming — every time code allocates a buffer ("allocate 100 MB for this cache"), the developer must know whether to use 100,000,000 or 104,857,600 bytes. Getting it wrong creates subtle bugs: 4,857,600 fewer bytes than intended may cause buffer overflows near the boundary, or 4,857,600 extra bytes may exceed available memory in constrained environments. The choice between decimal and binary is not arbitrary — it must match the expectations of the system, hardware, and protocol being worked with.
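The "allocate 100 MB" ambiguity described above can be made concrete; a minimal sketch using a plain `bytearray` as the buffer:

```python
REQUESTED_MIB = 100
decimal_size = REQUESTED_MIB * 1_000_000   # 100,000,000 B - wrong for a MiB buffer
binary_size = REQUESTED_MIB * 1_048_576    # 104,857,600 B - what the system expects
print(binary_size - decimal_size)          # 4857600 bytes short if decimal is used

buf = bytearray(binary_size)               # allocate the binary size
assert len(buf) == 104_857_600
```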

Frequently Asked Questions

How do you convert megabytes to bytes?

Multiply MB by 1,000,000 (decimal) or MiB by 1,048,576 (binary). Examples: 1 MB = 1,000,000 bytes; 10 MB = 10,000,000 bytes; 100 MiB = 104,857,600 bytes; 256 MiB = 268,435,456 bytes (common RAM module size); 512 MB = 512,000,000 bytes (decimal); 1,024 MiB = 1,073,741,824 bytes = 1 GiB. For programming: always use 1,048,576 bytes per MiB when working with memory addresses and buffer sizes.

How many bytes are in common RAM sizes?

RAM uses binary MiB exclusively. Common RAM sizes and exact byte counts: 512 MiB = 536,870,912 bytes; 1 GiB = 1,073,741,824 bytes; 2 GiB = 2,147,483,648 bytes; 4 GiB = 4,294,967,296 bytes; 8 GiB = 8,589,934,592 bytes; 16 GiB = 17,179,869,184 bytes; 32 GiB = 34,359,738,368 bytes; 64 GiB = 68,719,476,736 bytes. A 64-bit pointer (8 bytes) can address 2⁶⁴ = 18,446,744,073,709,551,616 bytes = 16 exbibytes (EiB) of memory, far beyond current hardware.
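The 64-bit address-space figure is pure powers of two; a quick sketch:

```python
POINTER_BITS = 64
addressable = 2 ** POINTER_BITS   # bytes a 64-bit pointer can address
EiB = 2 ** 60                     # one exbibyte
print(addressable)                # 18446744073709551616
print(addressable // EiB)         # 16
```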

How large are typical software downloads in bytes?

Software sizes in bytes: Windows 11 ISO 5,100 MB = 5,100,000,000 bytes (5.1 GB decimal); macOS Ventura installer 12,000 MB = 12,000,000,000 bytes (12 GB decimal); iOS update typical 500-2,000 MB = 500,000,000-2,000,000,000 bytes; Android update 500-3,000 MB = 500,000,000-3,000,000,000 bytes; Microsoft Office 365 install 4,000-6,000 MB = 4,000,000,000-6,000,000,000 bytes; Python with common packages 500-2,000 MB. Mobile app stores quote sizes in decimal MB; device storage shows remaining space in decimal GB (iOS) or binary GiB (Android, version-dependent).

How do you estimate database sizes in megabytes and bytes?

Database size planning: 1 million rows of a simple user table (ID + email + timestamps ≈ 100 bytes/row) = 100,000,000 bytes = 100 MB. With indexes (B-tree overhead 30-50%): 130-150 MB total. PostgreSQL WAL (write-ahead log) default segment size: 16 MB = 16,000,000 bytes (decimal, one WAL segment). Redis memory: storing 1 million simple key-value pairs (strings, 50 bytes avg) takes approximately 100-200 MB total due to Redis per-key overhead. Elasticsearch index: a document with 10 text fields averages 500-2,000 bytes; 1 million documents = 500 MB to 2 GB before inverted index overhead.
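The user-table estimate above is a single multiplication plus an index-overhead band; a sketch with the same assumed figures:

```python
ROWS = 1_000_000
BYTES_PER_ROW = 100                 # ID + email + timestamps (rough estimate)
table_bytes = ROWS * BYTES_PER_ROW  # 100,000,000 B = 100 MB raw
# B-tree index overhead of 30-50% on top of the raw table:
low, high = int(table_bytes * 1.3), int(table_bytes * 1.5)
print(table_bytes, low, high)       # 100000000 130000000 150000000
```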

How much data does video streaming use per hour?

Video streaming data consumption: Netflix SD (480p) 300 MB/hr = 300,000,000 bytes/hr; Netflix HD (1080p) 3,000 MB/hr = 3,000,000,000 bytes/hr; Netflix 4K 7,000 MB/hr = 7,000,000,000 bytes/hr; YouTube 1080p ≈ 2,100 MB/hr; YouTube 4K ≈ 10,000 MB/hr; Spotify audio stream (320 kbps) 144 MB/hr = 144,000,000 bytes/hr; Zoom video call (1080p) 1,800 MB/hr. A 10 GB cellular data plan allows approximately 1.4 hours of 4K streaming, 3.3 hours of HD streaming, or 33 hours of SD streaming.
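Plan-lifetime math is just division by the per-hour rate; a sketch using the approximate Netflix rates quoted above:

```python
PLAN_GB = 10                                            # decimal GB, as sold by carriers
RATE_GB_PER_HOUR = {"SD": 0.3, "HD": 3.0, "4K": 7.0}    # approximate streaming rates
for quality, rate in RATE_GB_PER_HOUR.items():
    print(f"{quality}: {PLAN_GB / rate:.1f} hours")     # SD: 33.3, HD: 3.3, 4K: 1.4
```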

Why do memory and cache sizes come in powers of two?

Power-of-two sizes (128, 256, 512, 1024 MiB) appear throughout computing because binary arithmetic is exact for these values. Memory allocators prefer power-of-two sizes to reduce fragmentation. Examples: 256 MiB = 268,435,456 bytes; 512 MiB = 536,870,912 bytes; 1,024 MiB (1 GiB) = 1,073,741,824 bytes. CPU caches: L2 often 256 KiB or 512 KiB; L3 often 4, 8, 16, or 32 MiB, all binary powers. GPU VRAM: 4, 8, 16, 24, 48 GiB, also binary. Networking: TCP window size scales in binary powers; Ethernet jumbo frames at 9 KiB (9,216 bytes = 9 × 1,024). Whenever you see a round-number size in computing, it is almost always a binary power under the hood.
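The standard bit-trick test for "is this byte count a power of two" makes the pattern easy to verify; a minimal sketch:

```python
def is_power_of_two(n: int) -> bool:
    """True when n has exactly one set bit, i.e. n == 2**k for some k >= 0."""
    return n > 0 and n & (n - 1) == 0

for b in (268_435_456, 536_870_912, 1_073_741_824, 9_216):
    print(b, is_power_of_two(b))
# 9,216 (the 9 KiB jumbo frame) is 9 * 1,024: a multiple of a binary power,
# not a binary power itself
```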