Big Data Files
BlueJean
5 - Atom
06-06-2023
02:40 PM
This is my first DS project, so I'm definitely a newbie.
I have two datasets with over a million points each. Can anyone tell me if there is a way to convert or compress those .csv files so that they run more smoothly in Python? Any help would be greatly appreciated.
Thanks in advance.
Labels: Question
1 REPLY
acarter881
12 - Quasar
06-23-2023
09:56 AM
Hello, @BlueJean.
Are you reading the files into Alteryx, and is that the slow part? If so, have you tried caching the Input Data tools? Caching converts them to a temporary .yxdb, which significantly reduces workflow run time.
Attached are a few other suggestions related to Python.
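Since the attachment isn't reproduced here, one common Python-side approach worth sketching: load the CSV once, downcast numeric columns and convert low-cardinality strings to `category` to shrink memory, then save to a binary format so later runs skip CSV parsing entirely. This is a minimal sketch, not the attached suggestions; the column names and the sample data below are made up for illustration, and pickle is used only because it ships with pandas (Parquet via `to_parquet` is a better long-term choice if pyarrow or fastparquet is installed).

```python
import numpy as np
import pandas as pd

def shrink(df: pd.DataFrame) -> pd.DataFrame:
    """Downcast numeric columns and convert low-cardinality
    string columns to category to cut memory use."""
    out = df.copy()
    for col in out.select_dtypes(include="integer").columns:
        out[col] = pd.to_numeric(out[col], downcast="integer")
    for col in out.select_dtypes(include="float").columns:
        out[col] = pd.to_numeric(out[col], downcast="float")
    for col in out.select_dtypes(include="object").columns:
        # Only convert when repetition makes category worthwhile.
        if out[col].nunique() < 0.5 * len(out):
            out[col] = out[col].astype("category")
    return out

# Stand-in for a large CSV load; in practice:
# df = pd.read_csv("data.csv")
df = pd.DataFrame({
    "id": np.arange(100_000, dtype="int64"),
    "value": np.random.rand(100_000),
    "state": np.random.choice(["NY", "CA", "TX"], 100_000),
})

small = shrink(df)
before = df.memory_usage(deep=True).sum()
after = small.memory_usage(deep=True).sum()

# Save once in a binary format; subsequent runs load this
# instead of re-parsing the CSV.
small.to_pickle("data.pkl")
```

For files too big to load at once, `pd.read_csv(..., chunksize=...)` lets you process the data in pieces instead.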
