I had trouble processing a large CSV file recently: it was nearly 100 MB in size, and with the resources available on my laptop it was not possible to process it and insert the whole lot into a database in one go.
So I created a process to chop the file into smaller pieces that could be processed and inserted into the database one batch at a time. This took a while, but at least it finished.
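The same batch-and-insert idea can be sketched in plain Python. This is an illustration only, not the RapidMiner process described below: the file path, table schema, and batch size are all made up for the example, and it generates its own dummy CSV (much as the example process does) so it runs end to end.

```python
import csv
import sqlite3
from itertools import islice

CSV_PATH = "dummy.csv"  # hypothetical path; point this at your own file
BATCH_SIZE = 1000       # rows per batch; tune to the memory you have

# Create a dummy CSV purely for illustration.
with open(CSV_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "value"])
    writer.writerows((i, i * 2) for i in range(5000))

def iter_batches(reader, size):
    """Yield lists of at most `size` rows, so the whole file is never in memory."""
    while True:
        batch = list(islice(reader, size))
        if not batch:
            return
        yield batch

conn = sqlite3.connect(":memory:")  # swap for your real database connection
conn.execute("CREATE TABLE rows (id INTEGER, value INTEGER)")

with open(CSV_PATH, newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for batch in iter_batches(reader, BATCH_SIZE):
        conn.executemany("INSERT INTO rows VALUES (?, ?)", batch)
        conn.commit()  # commit per batch so each chunk completes independently

count = conn.execute("SELECT COUNT(*) FROM rows").fetchone()[0]
print(count)  # 5000
```

Committing after each batch is what keeps the memory footprint small: only one chunk of rows is ever held at a time, and a failure partway through leaves the already-committed batches in place.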
Here is an example process for chopping CSV files. By way of illustration it creates a large CSV file and then splits it using the "Loop Batches" operator.
To use it on your own data, remove the "generate dummy data" and "write dummy data" operators and change the macro "fileToRead" in the process context to point to the location of the file you want to read.