XML file, data more than 2147483648 bytes
@VA hm, something seems strange. What are you passing into the batch macro? It'd be useful to understand why the cell is so large. What does your macro normally do?
Normally you'd see a list of file paths, and the batch macro would bring in and process each of them one at a time.
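To illustrate that pattern in something other than Designer - a minimal Python sketch, with entirely hypothetical file names:

# The usual batch-macro pattern: one control record (file path) per run.
file_paths = ["sales_jan.xml", "sales_feb.xml", "sales_mar.xml"]  # hypothetical list

for path in file_paths:
    with open(path, encoding="utf-8") as f:   # one file per iteration
        xml_text = f.read()
    # ...parse and process this file, then move on to the next one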
All the best,
BS
You are beyond the limit of an int32 in a single column - in a single row - in a single field. There are many systems that will flag a value of that size. I'd recommend figuring out an alternative way to parse this.
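For context, the 2147483648 bytes in the thread title is exactly one past the signed 32-bit ceiling - a quick Python check:

INT32_MAX = 2**31 - 1        # 2147483647, the largest signed 32-bit value
reported  = 2147483648       # the size quoted in the error

print(INT32_MAX)             # 2147483647
print(reported > INT32_MAX)  # True - one byte beyond what an int32-sized
                             # length for a single field can describe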
I'm using "V_WString".
@VA is there any potential to open the original file in Notepad++ and save it as a .txt? Then you can read it in as a .txt and consider ways to start parsing it, leveraging the XML Parse tool - although I'm not sure that's the best solution.
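If the goal is to pull out the pieces you need without ever holding the whole document in a single field, a streaming parser might also be worth a look. A minimal sketch using Python's standard-library xml.etree.ElementTree.iterparse - the path and tag names here are hypothetical, not VA's actual data:

import xml.etree.ElementTree as ET

SOURCE = "large_file.xml"   # hypothetical path to the oversized file

# iterparse yields each element as its closing tag is read, so the
# document never has to sit in memory (or in one field) all at once.
for event, elem in ET.iterparse(SOURCE, events=("end",)):
    if elem.tag == "record":        # hypothetical element of interest
        print(elem.findtext("id"))  # handle just this element
    elem.clear()                    # release elements already processed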
All the best,
BS
@VA this post does suggest you should be fine with reading in a large XML file: https://community.alteryx.com/t5/Alteryx-Designer-Desktop-Discussions/XML-file-size-limitations/td-p...
@BS_THE_ANALYST - this is a single field exceeding the V_WString limit, not the entire file. I would posit that VA is trying to tag something as an XML value that isn't really a single XML value - you can't store an entire book in one value.
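One way to test that theory outside Designer is to scan the file for unusually large text nodes - another rough sketch along the same streaming lines, where the path and threshold are assumptions:

import xml.etree.ElementTree as ET

SOURCE = "large_file.xml"       # hypothetical path
THRESHOLD = 256 * 1024 * 1024   # flag anything over ~256 MB as suspect

for event, elem in ET.iterparse(SOURCE, events=("end",)):
    size = len(elem.text or "")
    if size > THRESHOLD:
        print(f"<{elem.tag}> holds {size:,} characters - far too big for one field")
    elem.clear()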
@apathetichell so it's not so much the file size, it's that the user is parsing a column that contains a very long value? If that were the case, I would have thought the process would continue and that particular value would simply display as truncated. The user's workflow is just batch-reading XML files, right? I imagined it acting along similar lines to the XML Parse tool.
I'm stumped without more info.
