Processing Large CSV
Let's say you're trying to run some data processing and cleaning on a .csv file that's currently 100GB in size. You realize it's too big to fit in memory, so you can't load the file with pandas.read_csv(). How would you get around this issue and clean the data?
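One common approach is to stream the file in fixed-size chunks: pandas' read_csv() accepts a chunksize parameter, which returns an iterator of DataFrames so that only one chunk is in memory at a time. The sketch below assumes hypothetical file names (data.csv, cleaned.csv) and a placeholder cleaning step (dropping rows with missing values); it first writes a tiny sample file so the example is self-contained.

```python
import pandas as pd

# Create a small sample CSV to stand in for the 100GB file
# (hypothetical filename; in practice the large file already exists on disk).
pd.DataFrame({"a": [1, None, 3, 4], "b": ["x", "y", None, "z"]}).to_csv(
    "data.csv", index=False
)

def clean_chunk(chunk: pd.DataFrame) -> pd.DataFrame:
    # Placeholder cleaning step: drop rows with any missing values.
    return chunk.dropna()

# chunksize makes read_csv return an iterator of DataFrames,
# so only one chunk is resident in memory at a time.
with pd.read_csv("data.csv", chunksize=2) as reader:
    for i, chunk in enumerate(reader):
        cleaned = clean_chunk(chunk)
        # Append each cleaned chunk to the output; write the header only once.
        cleaned.to_csv("cleaned.csv", mode="a", header=(i == 0), index=False)
```

In practice you'd tune chunksize to your available memory; for heavier workloads, out-of-core libraries such as Dask offer a similar pandas-like API over partitioned data.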