Input Tool CSV limits - only loading 100,000 rows
Hi,
I am having a strange problem: Alteryx is only letting me import 100,000 rows in the Input tool when connecting to a .csv file locally on my machine.
I have searched but have not found any reference to a limitation or size restriction that should be causing this.
Does anyone have any ideas?
Cheers
Seems strange, I've never had this issue.
Are you using a Browse tool to view your data?
Are you sure you haven't set a record limit in the Input tool?
Yes, it does seem strange...
The Browse tool doesn't seem to be the issue; it's what's coming out of the Input tool, like so:
No record limit is set.
This is going to sound stupid, but can we be sure the file isn't 100,000 records in length?
Ben
Worth the question, it's coming up to midnight...
No, unfortunately not: 184,903,890 rows, but not many columns...
With the workflow you showed, as long as there isn't a record limit on your input, there is no reason why Alteryx would stop at 100,000 records.
Can I ask two questions: How big is your raw CSV's file size on disk (right-click the file in Windows Explorer and select Properties to view the file size)? Have you loaded it back into SQL or into another program to confirm that it is indeed 180 million records?
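If it helps to sanity-check the record count outside Alteryx, here is a minimal sketch that streams a large CSV and counts its data rows without loading it into memory. The path and the header assumption are placeholders, not details from this thread:

```python
import csv

# Illustrative path only - substitute the actual CSV location.
CSV_PATH = r"C:\data\big_export.csv"

def count_csv_records(path, has_header=True):
    """Stream the file and count data rows without holding it in memory."""
    with open(path, "r", newline="", encoding="utf-8", errors="replace") as f:
        total = sum(1 for _ in csv.reader(f))
    # Subtract the header row if the file has one.
    return total - 1 if has_header and total > 0 else total

print(f"{count_csv_records(CSV_PATH):,} data rows")
```

Using csv.reader rather than a raw line count means quoted fields containing embedded newlines are handled correctly, which can matter for a file this large.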
Hi David,
Thanks for responding...
There are 180 million rows...
I found a solution, unfortunately not an ideal one: restarting the machine and re-opening the software fixed the issue.
Cheers
Facing a similar issue. I have more than 50K rows in the source. They are read with the Input Data tool, and the Output Data tool seems to insert them all (at least that is what the confirmation message states). But in the target table I get only 20K rows. The restart/reload solution did not help. Could you please advise? Thanks!
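For anyone hitting the same gap between the confirmation message and the target table, one way to narrow it down is to count rows on both ends outside Alteryx. The sketch below assumes a Python environment and uses sqlite3 purely as a stand-in for whatever database the target table actually lives in; the paths and table name are hypothetical:

```python
import csv
import sqlite3

# All names below are illustrative stand-ins - the post does not say which
# database or table is being loaded, so adjust these for your environment.
CSV_PATH = r"C:\data\source_extract.csv"
DB_PATH = r"C:\data\target.db"
TABLE = "target_table"

def count_csv_rows(path, has_header=True):
    """Count data rows in the source CSV by streaming it."""
    with open(path, "r", newline="", encoding="utf-8", errors="replace") as f:
        total = sum(1 for _ in csv.reader(f))
    return total - 1 if has_header and total > 0 else total

def count_table_rows(db_path, table):
    """Count rows actually present in the target table."""
    with sqlite3.connect(db_path) as conn:
        (rows,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return rows

source = count_csv_rows(CSV_PATH)
target = count_table_rows(DB_PATH, TABLE)
print(f"source: {source:,}  target: {target:,}  missing: {source - target:,}")
```

If the two counts disagree, the rows are being lost on the write side rather than in the Input Data tool, which narrows down where to look next.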
