Processing Large CSV
Let’s say you’re trying to run some data processing and cleaning on a .csv file that’s currently 100 GB.
You realize it’s too big to fit in memory, so you can’t simply load the whole file with pandas.read_csv(). How would you get around this issue and clean the data?
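One common answer is to stream the file instead of loading it all at once: process a bounded number of rows at a time and write cleaned rows out incrementally, so memory use stays constant regardless of file size. (pandas supports this directly via the `chunksize` parameter of `read_csv()`, which yields DataFrames of a fixed number of rows.) Below is a minimal sketch using Python's built-in `csv` module; the whitespace-stripping "cleaning" step and the in-memory sample data are hypothetical stand-ins for the real 100 GB file and its cleaning logic.

```python
import csv
import io

def clean_rows(reader):
    """Yield cleaned rows one at a time, never holding the full file in memory."""
    for row in reader:
        # Hypothetical cleaning step: strip stray whitespace from every field
        yield [field.strip() for field in row]

# Tiny in-memory sample standing in for the 100 GB file; in practice you would
# pass open("huge.csv", newline="") here and write results to a new file as you go.
raw = io.StringIO("id, name \n1, alice \n2, bob \n")
cleaned = list(clean_rows(csv.reader(raw)))
# cleaned == [["id", "name"], ["1", "alice"], ["2", "bob"]]
```

Because `clean_rows` is a generator, only one row is materialized at a time; the same pattern scales to chunked pandas processing or to out-of-core tools like Dask when the cleaning logic needs DataFrame semantics.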