Dynamically Select SQL server based on a variable & dynamically change SQL query
Hello everyone,
I am struggling with the following situation:
- I am trying to create Alteryx environment variables (DEV, UAT, PROD) while bringing in an OLE DB SQL connection through a Dynamic Input tool.
- Based on the selected environment, the variables need to automatically update the Workstation ID in the SQL connection.
- The SQL query itself also contains a field that automatically updates the "ingestion_date" in the query based on a date given in the input.
I tried to put everything in a workflow (attached) but I am getting an error: "Error opening connect string. Can't create data source object."
I have noticed that not all of the text previously written into the query is being output, so I think this is the issue, but I don't know how to solve it.
Any help or hints you might have for me?
Please note that the attached workflow contains some fake info, for testing and confidentiality reasons.
Thank you in advance,
Denisa
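For readers following along, the workflow's intended logic can be sketched outside Alteryx. This is an illustration only: the server names, the OLE DB provider keywords, and the query template below are made-up placeholders, not the actual values from the attached workflow.

```python
# Illustration (not Alteryx) of what the workflow builds: a per-environment
# connect string plus a query with the ingestion date substituted in.
# SERVERS and the query template are hypothetical placeholders.

SERVERS = {
    "DEV":  "SQLDEV01",
    "UAT":  "SQLUAT01",
    "PROD": "SQLPRD01",
}

def build_connection(env: str) -> str:
    """Pick the Workstation ID for the chosen environment."""
    server = SERVERS[env]  # a KeyError here signals an unknown environment
    return f"oledb:Provider=SQLOLEDB;Data Source={server};Trusted_Connection=yes"

def build_query(ingestion_date: str) -> str:
    """Inject the run date into the query text, as the Formula tool does."""
    return f"select * from loans where ingestion_date = '{ingestion_date}'"
```

For example, `build_connection("UAT")` yields a connect string containing `SQLUAT01`, while `build_query("2023-01-31")` ends with the substituted date literal.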
Solved! Go to Solution.
The first problem is the size of the variable SQLQuery: as a V_String of size 2000, it is truncating your query. Increase the size; at 2000 characters your query comes out like this:
Since you already use an interface tool for the date, couldn't the environment also be selected the same way?
I'll analyze the connection and get back to you.
SQLQuery
select 'LOAN_IQ' as nx_appli_emettrice,
a.RAF as chartfield2,
Cast(DatePart(year, a.accounting_dt) as varchar(4)) + Right('00' + Cast(DatePart(month, a.accounting_dt) as varchar(2)), 2) as monthid,
a.LOAN_NUMBER as nx_contrat,
a.posted_date,
a.accounting_dt,
Abs(a.Transaction_Amount) as monetary_amount,
a.currency_cd,
a.product,
a.business_line,
a.operating_unit,
a.nx_desk,
a.long_name,
a.CURRENT_OFFICER_NAME_LOAN nx_varchar_20_3,
a.nx_ref_book,
case
when a.Transaction_Description_I is NULL then a.Transaction_Description_P
else a.Transaction_Description_I
end as nx_varchar_10_1,
case
when a.Transaction_Description_I is NULL then a.Transaction_Description_P
else a.Transaction_Description_I
end as nx_varchar_10_2
from
(
select cf.fac_num_fac_cntl,
cf.cfl_effective_dt ost_dte_effective,
cf.cfl_cde_tran_type Transaction_Code,
cf.cfl_amt_cashflow Transaction_Amount,
cf.cfl_amt_cashflow,
cf.cfl_dsc_tran_type Transaction_Description_P,
cf.cfl_dsc_tran_type Transaction_Description_I,
Substring(ofc.dea_cde_expense, 3, Len(LTrim(RTrim(ofc.dea_cde_expense))) - 2) as nx_desk,
case
when teth.legal_name is NULL then cf.cfl_borrower_short_nme
else teth.legal_name
end as long_name,
cf.cfl_dsc_bus_tran as nx_ref_book,
cf.cfl_effective_dt posted_date,
cf.cfl_cde_currency currency_cd,
mba.business_line,
ofc.dea_cde_expense operating_unit,
cf.cfl_effective_dt accounting_dt,
ofc.ost_borrow_raf RAF,
ofc.ost_nme_alias LOAN_NUMBER,
fac1.dea_uid_rel_managr_nme CURRENT_OFFICER_NAME_LOAN,
NULL SERVICING_UNIT_CODE_loan,
Cast(cf.cfl_trans_release_dte as date) as eff_date,
case
when fac1.fac_cde_fac_type in ('REV', 'BORBS', 'VATRV', 'SPOTL', 'MIXNC', 'TRANC') then 'CIB Banking - Revolving credit facilities'
when fac1.fac_cde_fac_type in ('CAPEX', 'VATTR') then 'CIB Banking - Secured loans & asset financing'
when fac1.fac_cde_fac_type in ('TERM', 'ADF', 'DSCTB', 'LITIG', 'UCOVD', 'MISMA'
You are on the right track; here are some suggestions:
- As indicated by @geraldo, you should change your data types to V_WString to accommodate very large strings.
- I would simplify your logic: instead of [User_DEV], [User_UAT] and [User_PROD], I suggest replacing them with a single [TargetEnv]; this approach simplifies building your SQLConn.
- Since the introduction of [TargetEnv] simplified building [SQLConn], I added the variable [SQLConnNew].
- I strongly suggest splitting your issue in two:
- Connectivity
- Retrieving the data.
Connectivity:
Focus on connecting to each environment and retrieving something simple from DEV, UAT and PROD, something like "SELECT LOAN_NUMBER from ? ". I know the loan number is returned by your huge query, but I can't figure out from which table; the idea is to try a very simple query just to prove connectivity.
Retrieving the data:
Once connectivity is done and you are able to connect to all 3 environments, it is time to try the large query. I hope you have already tested the large query with a SQL client, so you know it runs fine; even so, I would first try a query returning a small number of rows.
Hope this helps,
Arnaldo
Let us know how you go.
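The two-stage approach above can be sketched generically. Here `connect` is a stand-in for whatever submits a query to an environment (in the workflow, the Dynamic Input tool), and the probe query is an assumed placeholder:

```python
def run_in_stages(connect, environments, big_query):
    """connect(env, sql) is a stand-in for submitting sql to an environment."""
    # Stage 1: connectivity - prove each environment answers a trivial query.
    for env in environments:
        connect(env, "SELECT 1 AS ping")  # placeholder probe query
    # Stage 2: data - only now submit the large query to each environment.
    return [connect(env, big_query) for env in environments]
```

The point of the split is diagnostic: if stage 1 fails, the problem is the connect string; if stage 1 passes and stage 2 fails, the problem is the query itself.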
Hello,
Thank you both for the answers. I tried changing the data types to V_WString, but this does not solve my issue: the value is still being truncated.
Regarding the query, I have to keep it whole. I am not sure if you noticed, but it has a field that automatically updates the query based on the parameter date. The big query works perfectly when used normally in a Dynamic Input or a regular Input tool, even against our database.
Also, since the query is big and updates itself based on the "business day" field, the working machine has to be updated based on the environment within the same Dynamic Input, so maybe my idea of working around this isn't good enough.
Maybe there is another way to do it using Interface tools? Do you have any example of this?
Best regards,
Denisa
Yes, I am aware you are building a dynamic query. What I am suggesting is a proof of concept, splitting the problem into two steps. First, focus on proving connectivity to the DEV/UAT/PROD environments: you do not need the full query, just try a very small one, a SELECT field FROM table, purely to prove you are connecting. Once that is done, switch to the large query. Right now it is not easy to tell where your workflow breaks: while connecting, or while running the query?
Arnaldo
Give the connections different names in Manage In-DB Connections, use a Dynamic Input In-DB tool, and send in the name you want as the connection. You can use a drop-down/selector or whatever to choose which environment you are running on.
Hello @apathetichell. I am not sure I understood your solution. Do you have an example workflow?
Again, just to be clear: you don't need to change the type of the variable, you need to increase its size from 2000 to a larger value. Your query alone has 6720 characters.
Go to the View > Interface Designer menu option and activate the Browse tool, and you will see that your query is being cut at 2000 characters.
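To illustrate the size point (a sketch, not Alteryx itself): a fixed-size string field behaves like keeping only the first N characters, so a 6720-character query stored in a field of size 2000 loses its tail, while a field at the maximum size of 1073741823 keeps it all.

```python
# Build a query longer than 2000 characters, like the one in this thread.
query = "select " + ", ".join(f"col_{i}" for i in range(400)) + " from t"
assert len(query) > 2000

truncated = query[:2000]                  # what a size-2000 field keeps
assert not truncated.endswith("from t")   # the tail of the SQL is gone

kept = query[:1073741823]                 # a max-size field keeps everything
assert kept == query
```

A truncated query is no longer valid SQL, which is consistent with the driver rejecting the request downstream.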
Hello @ArnaldoSandoval ,
I tried again to run it as you instructed, but I get the same error: "Error opening connect string: Can't create Data Source Object".
Although the field "SQLConnNew" is a V_WString with a big size, it still cuts off the data in the cell. Please see the attached picture.
@geraldo, all of the fields that make up the final formula are already V_WString, each with a size of 1073741823. I think the issue is related to the large size of the query, which is not being output correctly. I have also read about a similar situation, where it appeared the query used in the Dynamic Input shouldn't be too large either, or it will fail. Unfortunately, on my side I have to use the entire big query.
