Microsoft Excel has a hard limit of 1,048,576 rows (2²⁰) per worksheet, so any dataset larger than that (roughly 10.5 lakh, or about 1.05 million rows) won't fit in a single sheet.
✅ Solutions:
Here are your best options for handling more than 1 million rows of data:
1. Use a Database (Recommended)
Move your data to a database like:
- MySQL / MariaDB
- PostgreSQL
- SQLite
- Microsoft Access (up to 2 GB limit)
- Google BigQuery (for massive data)
You can then run SQL queries efficiently and even export manageable chunks to Excel.
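As a minimal runnable sketch using Python's built-in `sqlite3` module (the file name, table, and columns here are placeholders, and a tiny sample CSV is written first so the example is self-contained):

```python
import csv
import sqlite3

# Write a small stand-in CSV; in practice this would be your large export
with open("sales.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["region", "amount"])  # header
    writer.writerows([["north", 120.0], ["south", 75.5], ["north", 30.0]])

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Stream the CSV straight into the table, row by row
with open("sales.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    conn.executemany("INSERT INTO sales VALUES (?, ?)", reader)

# SQL has no 1,048,576-row ceiling; aggregate directly in the database
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('north', 150.0), ('south', 75.5)]
```

The same pattern scales to millions of rows, and you can export just the query results (a manageable slice) back to Excel.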
2. Use a Programming Language
If you’re comfortable with programming:
- Python (Pandas) or R can handle millions of rows easily.
- You can load, analyze, and even export to multiple Excel sheets or CSVs.
```python
import pandas as pd

# Load the large CSV (this needs enough RAM to hold the whole file)
df = pd.read_csv("your_large_file.csv")

# Split into Excel files of 1,000,000 rows each, safely under the
# 1,048,576-row limit (leaving room for the header row)
chunk_size = 1_000_000
for i, start in enumerate(range(0, len(df), chunk_size)):
    df.iloc[start:start + chunk_size].to_excel(f"output_part_{i + 1}.xlsx", index=False)
```
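If the CSV is too large to fit in memory at all, pandas can also stream it in fixed-size pieces via the `chunksize` parameter of `read_csv`. A small self-contained sketch (a tiny stand-in file is generated so it runs end to end; in practice `big.csv` would be your multi-million-row file):

```python
import pandas as pd

# Create a small stand-in CSV so the sketch is runnable;
# a real file could have millions of rows
pd.DataFrame({"id": range(10), "value": range(10)}).to_csv("big.csv", index=False)

# Stream the file in chunks instead of loading it all at once
chunk_sizes = []
for chunk in pd.read_csv("big.csv", chunksize=4):
    chunk_sizes.append(len(chunk))  # process each piece, e.g. chunk.to_excel(...)

print(chunk_sizes)  # → [4, 4, 2]
```

Each `chunk` is an ordinary DataFrame, so memory use stays bounded by the chunk size rather than the file size.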
3. Use CSV instead of XLSX
The CSV format itself has no row limit (you're bounded only by disk space), though note that opening a CSV *in Excel* still truncates it at 1,048,576 rows.
- Open with a text editor, code, or tools like Power BI / Tableau / Python.
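Because CSV is plain text, you can inspect huge files line by line without loading them into memory. A small self-contained sketch (the file name and columns are made up for illustration):

```python
import csv

# Write a small demo CSV, then count its data rows by streaming line by line;
# the same approach works for files far beyond Excel's 1,048,576-row limit
with open("demo.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["id", "value"])              # header
    w.writerows([[i, i * i] for i in range(5)])

with open("demo.csv", newline="") as f:
    n_rows = sum(1 for _ in f) - 1           # subtract the header line
print(n_rows)  # → 5
```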
4. Power BI or Looker Studio (formerly Google Data Studio)
These tools are designed to analyze large datasets and support millions of rows from sources like Excel, SQL, Google Sheets, etc.
5. Split Excel Files
If you must use Excel, split your data into multiple files or sheets, each under the 1 million row limit.
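The code in option 2 writes multiple *files*; to keep everything in one workbook you can instead write multiple *sheets* with `pd.ExcelWriter`. A minimal sketch (a toy 10-row DataFrame and a 4-row sheet size stand in for real data and a ~1,000,000-row limit; writing `.xlsx` requires the `openpyxl` package):

```python
import pandas as pd

# Toy data: 10 rows split into sheets of 4 (use ~1_000_000 in practice)
df = pd.DataFrame({"x": range(10)})
sheet_rows = 4

with pd.ExcelWriter("split.xlsx") as writer:
    for i, start in enumerate(range(0, len(df), sheet_rows)):
        # Each slice lands on its own sheet, each under the row limit
        df.iloc[start:start + sheet_rows].to_excel(
            writer, sheet_name=f"part_{i + 1}", index=False
        )
```

Opening `split.xlsx` then shows sheets `part_1`, `part_2`, `part_3` holding 4, 4, and 2 rows respectively.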