You can pass parameters/arguments to your SQL statements by programmatically building the SQL string in Scala/Python and passing it to sqlContext.sql(string).
Here's an example using String formatting in Scala:
val param = 100
sqlContext.sql(s"""SELECT * FROM table1 where param=$param""")
Note the 's' in front of the first """. It marks a Scala interpolated string, which substitutes each $param with its value.
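Keep in mind that interpolation is plain text substitution, so string-valued parameters need their own SQL quotes. A minimal sketch (the name column is hypothetical):
val name = "amit"
// Wrap the string in single quotes; without them Spark SQL parses
// the bare value as a column identifier rather than a string literal.
sqlContext.sql(s"""SELECT * FROM table1 WHERE name = '$name'""")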
Here's an example using String formatting in Python:
param = 100
query = "SELECT * FROM table1 where param={}".format(param)
sqlContext.sql(query)
I tried the same with an insert query, as below:
val a = 1207
val b = "amit"
val c = 20
sqlContext.sql(s"""insert into table employee select t.* from (select $a, $b, $c) t""")
But I got the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'amit' given input columns ;
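The error suggests $b was interpolated without quotes, so Spark SQL read amit as a column reference rather than a string literal. Quoting the interpolated value should resolve it; a sketch, assuming the employee table accepts these three columns:
val a = 1207
val b = "amit"
val c = 20
// '$b' keeps the interpolated text a SQL string literal.
sqlContext.sql(s"""insert into table employee select t.* from (select $a, '$b', $c) t""")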