Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

SAP HANA - In-DB tools - cannot allocate enough memory

ChristianStaerk
6 - Meteoroid

Hi everyone,

 

I am currently running into an issue for which I have not found a suitable solution in the forum yet.

I have data in a SAP HANA table and want to build a new workflow that extracts data from this table (approx. 14 million rows and 100 columns) via a join and combines it with newly calculated data (500k rows, 100 columns).
The workflow is set up entirely with In-DB tools, except for a small table that I am streaming in via the Data Stream In tool to filter by a join.

 

The problem is that when I select the data from my historical cube and want to write it back to a temp table or combine it with new data, the full 64 GB of memory in HANA is consumed by this operation before the workflow fails completely...

 

The original table of 14 million rows is just 600 MB on HANA, and I cannot see any reason why it is temporarily blown up to more than 40 GB (24 GB are already in use beforehand) during this operation. At the moment I cannot see any suitable way to deal with this issue, or whether there are any settings on the HANA or Alteryx side to avoid it...

Has anyone dealt with a similar problem and can help me find a solution?
Any helpful advice is appreciated! Thanks in advance!

 

Best regards,
Christian

 

Key Facts:


SAP HANA:
- 64 GB tenant
- 24 GB already in use before the workflow runs
- historical cube approx. 600 MB on HANA before the workflow runs

 

User:
- access to just one schema
- read and write access on the HANA tenant

8 REPLIES
JoeS
Alteryx

Hi @ChristianStaerk 

 

Is the join a 1-to-1 join? Or could you be duplicating records, or even creating a Cartesian join between the two tables?
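The row multiplication JoeS is asking about is easy to reproduce in miniature outside HANA: if the join key is not unique on both sides, a join returns one row per matching *pair*, so row counts (and memory) multiply. A minimal sketch with Python's built-in sqlite3, using hypothetical table and column names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Hypothetical tables: 'hist' has 3 rows for key 1, 'new' has 2 rows for key 1.
cur.execute("CREATE TABLE hist (k INTEGER, v TEXT)")
cur.execute("CREATE TABLE new (k INTEGER, w TEXT)")
cur.executemany("INSERT INTO hist VALUES (?, ?)", [(1, "a"), (1, "b"), (1, "c")])
cur.executemany("INSERT INTO new VALUES (?, ?)", [(1, "x"), (1, "y")])

# A left join on a non-unique key yields one row per matching pair: 3 * 2 = 6.
rows = cur.execute("SELECT * FROM hist LEFT JOIN new ON hist.k = new.k").fetchall()
print(len(rows))  # 6, not 3
```

If both sides were unique on `k`, the result would keep the left side's 3 rows; with duplicates on both sides, the output grows multiplicatively, which is why a "simple" join can balloon a 600 MB table.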

ChristianStaerk
6 - Meteoroid

Hi @JoeS ,

 

Yes, it is a left outer In-DB join. So nothing special...

 

JoeS
Alteryx

Are both sides unique?

 

I'd be tempted (even though it may take more time) to run it locally using the standard tools, so I have more visibility into the data, and go from there.

geraldo
13 - Pulsar

Hi,

 

Have you tried switching from a temporary table to a physical table? I've had a lot of problems with temporary space blowing up due to the way Alteryx uses "WITH" clauses; I changed my processes to write to physical tables and delete them all at the end of the process.

 


ChristianStaerk
6 - Meteoroid

The right side is just one row, joined via a left join, so this join takes less than one second to process.

The issue really is the table that is written back to HANA at the end.

ChristianStaerk
6 - Meteoroid

Yes @geraldo, unfortunately it is the same when writing back to a permanent table... I have tried this as well.

geraldo
13 - Pulsar

Hi,

 

Can you send a screenshot of the workflow so we can see this join?

 


ChristianStaerk
6 - Meteoroid
Hi everyone, thanks for your input! It seems we have found the solution, which is quite simple but also pretty annoying...

I did a left join with a table on the left containing about 15 million rows with data for several periods (years 2016, 2017, 2018, 2019) and a right input consisting of one row for a specific year (e.g. 2019). Since I joined on the year, only a small part of the right-hand columns had data in them. It seems that NULL cells cause a major memory problem.

I solved the issue by joining on columns containing just a single value like '1', so that all cells on the right-hand side have data. This worked: the memory reserved for the temp table grew to only approx. 3 GB instead of more than 40 GB.

Best, Christian
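Christian's before/after can be sketched in miniature with Python's built-in sqlite3 (hypothetical table and column names; in Alteryx the constant key would be a literal '1' column added to both inputs before the In-DB join). Joining on the year leaves most left rows with NULL-filled right-hand columns; joining on a constant key matches the single right row to every left row, so no NULLs are produced:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Hypothetical miniature: history covers 4 years, the right input is
# a single row of calculated values for one year (2019).
cur.execute("CREATE TABLE hist (year INTEGER, val REAL)")
cur.execute("CREATE TABLE param (year INTEGER, factor REAL)")
cur.executemany("INSERT INTO hist VALUES (?, ?)",
                [(2016, 1.0), (2017, 2.0), (2018, 3.0), (2019, 4.0)])
cur.execute("INSERT INTO param VALUES (2019, 0.5)")

# Original join on the year: only the 2019 row matches, the rest get NULL.
by_year = cur.execute(
    "SELECT h.year, p.factor FROM hist h LEFT JOIN param p ON h.year = p.year"
).fetchall()

# Constant-key join ('1' = '1' on both sides): every left row matches the
# single right row, so every cell on the right is populated.
constant = cur.execute(
    "SELECT h.year, p.factor FROM hist h LEFT JOIN param p ON 1 = 1"
).fetchall()

nulls_by_year = sum(1 for _, f in by_year if f is None)
nulls_constant = sum(1 for _, f in constant if f is None)
print(nulls_by_year, nulls_constant)  # 3 0
```

Note the constant-key join is only safe here because the right input is a single row; with more rows on the right it would become a full Cartesian product.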