My last post was about SQL2CSV and 3 million records. Yes, I got that to work, but then I tried:
```
// read the whole file into one variable, then split it
// into lines_1 .. lines_COUNT on the delimiter
ReadFile>file,result
Separate>result,delim,lines
```
I went back to my SQL and modified it to fetch 50,000 records at a time, which produced 69 files of records. Now I'm trying to take those files and convert them into files of INSERT statements so that I can load my data. (Bulk load isn't working right now; that's a problem for another day.)
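For the curious, the per-file conversion I'm attempting looks roughly like this. It's only a sketch: the paths and the table name mytable are placeholders, and it assumes each CSV line is already quoted and escaped so it can be dropped straight into a VALUES clause.

```
// Sketch only: read one CSV file whole, split it on line breaks,
// and write out a file of INSERT statements.
// Paths and the table name are placeholders.
ReadFile>c:\data\my_csv.txt,my_csv
Separate>my_csv,CRLF,lines
Let>k=0
Repeat>k
  Let>k=k+1
  // indirect access to lines_1 .. lines_COUNT
  Let>this_line=lines_%k%
  WriteLn>c:\data\inserts.sql,wres,INSERT INTO mytable VALUES (%this_line%);
Until>k,lines_COUNT
```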
The trouble I'm having is that ReadFile> takes some time, depending on how big the file is. When I use the code above, Separate> appears to run immediately, and I'm wondering whether it waits for ReadFile> to finish, or whether it's asynchronous and things are getting jumbled and broken because I'm not testing to see whether ReadFile> is done. What actually happens is that Macro Scheduler gets "lost" and appears as if it is not responding.
I'm stepping through it now, and this is what I see:
While the debugger has stepped on to the next line, I don't see the result variable, my_csv, in the Watch List. Also, why is CF_RESULT in there?
I'm not copying, moving or renaming any file. Is it from the DeleteFile> that I do? The help file describes these variables as follows:

CF_RESULT: False if a CopyFile, MoveFile or RenameFile command was aborted.
CF_RESULT_CODE: Numeric result code of a CopyFile, MoveFile or RenameFile command (0 is success).
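For reference, if I were actually using one of the commands that sets these, I gather the check would look something like this (paths made up):

```
// Sketch: checking CF_RESULT after a file operation.
// Both paths here are just examples.
CopyFile>c:\data\in.csv,c:\backup\in.csv
If>CF_RESULT=TRUE
  MessageModal>Copy succeeded
Else
  MessageModal>Copy failed, code: %CF_RESULT_CODE%
Endif
```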
Maybe ReadLn> is more appropriate; I'll see. But I wonder whether anybody has insight into using these functions with large text files.
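If ReadLn> does turn out to be the better fit, I'd expect the loop to look something like this (file names made up again). The appeal is that it never holds all 50,000 lines in one variable:

```
// Sketch: process the file one line at a time.
// ReadLn> puts ##EOF## in the result once past the last line.
Let>k=0
Label>next_line
Let>k=k+1
ReadLn>c:\data\my_csv.txt,k,line
If>line=##EOF##,done
WriteLn>c:\data\inserts.sql,wres,INSERT INTO mytable VALUES (%line%);
Goto>next_line
Label>done
```

One caveat I've read about: ReadLn> with a line number may scan the file from the top on each call, so it can slow down on very large files. I'll time both approaches.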
I'll post what I find out.