Hyper file size
Hey,
I have a weird problem. I am outputting to the Hyper file format, and the file size is always a surprise. My workflow uses Alteryx database files as inputs; I do about three joins and remove blank and null rows. The output is around 20 million rows and the file size is 1.3 GB. A day ago, with the same rows plus 3 more columns, the file size was 1.1 GB. I also had a similar file size for a dataset with 145 million records, and now, at less than 20 percent of that data, I still end up with a larger file.
Can someone help me optimize the Hyper file size, or offer any other suggestions? The output feeds a Tableau data source.
Thanks
- Labels:
- Data Investigation
- Engine
- Tableau
Hi @SouravKayal - Here is what Tableau says: https://help.tableau.com/current/api/hyper_api/en-us/docs/hyper_api_defrag.html
You may use Alteryx instead of Python to rewrite the file though... 🙂
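For reference, here is a minimal sketch of the file-rewrite approach described in that defrag article, using the tableauhyperapi Python package. The file paths and the "Extract"."Extract" table name are assumptions; adjust them to match your actual .hyper file.

```python
# Rewrite a fragmented .hyper file by copying its table into a fresh file.
# Sketch only: the paths and the "Extract"."Extract" table name are assumed.
from tableauhyperapi import HyperProcess, Telemetry, Connection, CreateMode, TableName

SOURCE = "original.hyper"    # hypothetical path to the existing (large) file
TARGET = "rewritten.hyper"   # hypothetical path for the compacted copy
TABLE = TableName("Extract", "Extract")  # typical extract table name; verify yours

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    # Create the new, empty file and attach the old one under an alias.
    with Connection(endpoint=hyper.endpoint,
                    database=TARGET,
                    create_mode=CreateMode.CREATE_AND_REPLACE) as connection:
        connection.catalog.attach_database(SOURCE, alias="source")
        connection.catalog.create_schema("Extract")
        # Copy every row into the new file; Hyper writes the data back compactly.
        connection.execute_command(
            f'CREATE TABLE {TABLE} AS (SELECT * FROM "source".{TABLE})'
        )
```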
My recommendation is to reduce the number of columns (keep only the ones you need for visualization).
Also try using the Auto Field tool to optimize the column sizes; the sketch below shows one way to check what column types end up in the output.
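If you want to verify the column types actually stored in the .hyper output (for example, after running Auto Field), here is a small sketch using the tableauhyperapi Python package; the file path and the "Extract"."Extract" table name are assumptions.

```python
# Print the column names and types stored in an existing .hyper file.
# Sketch only: the path and the "Extract"."Extract" table name are assumed.
from tableauhyperapi import HyperProcess, Telemetry, Connection, TableName

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint, database="output.hyper") as connection:
        definition = connection.catalog.get_table_definition(TableName("Extract", "Extract"))
        for column in definition.columns:
            print(column.name, column.type)
```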
Hope this helps!
Regards
Can you share some more information on this, please?
