Hi,
Is there any step-by-step example of using the Apache Spark Code tool?
Thanks
Does this link help?
Harness the Power of Your Data Lake with Alteryx S... - Alteryx Community
Cheers!
@RobertOdera - those are two different things. That link is for the Apache Spark In-DB connection. The Spark Code tool should let you interact with a Spark Job/SQL/Notebook API (e.g., Databricks) directly.
And the answer is - not that I'm aware of. Are you on AWS or Azure? (or Dataproc/GCP)? I tried it on my AWS Databricks and do not remember getting much.
Okay, got it @apathetichell
I tinkered in Azure a while ago but do not recall this specific use case.
But also, I am a lightweight in this, so my limitations are skewing my answer - sorry I wasn't helpful.
@RobertOdera we've all been there - and will be there again. TBH, I use Databricks on a daily basis and I've played with this tool... and I haven't gotten anywhere with it. I'm on AWS Databricks, so it may behave differently on Azure.
From my findings, the solution still requires coding knowledge in Spark. The earlier goal was actually to see whether Alteryx could replace the Spark coding. This still leaves business users dependent on IT/vendors.
Um. Yes. The Apache Spark Code tool requires you to code in Spark. You can use other tools to replicate some of what you would do in Spark (In-DB tools when connected to Databricks, for example) - but your business user is going to be dependent on someone for something if you are storing your data in Databricks/Apache Spark and hoping to use Spark functionality. Do you want to explain what your use case is and what you are trying to accomplish?