
You can pass parameters/arguments to your SQL statements by programmatically building the SQL string in Scala/Python and passing it to sqlContext.sql(string).

Here's an example using String formatting in Scala:

val param = 100

sqlContext.sql(s"""SELECT * FROM table1 where param=$param""")

Note the 's' in front of the first """. It enables Scala string interpolation, so $param is replaced with its value inside the string.

Here's an example using String formatting in Python:

param = 100

query = "SELECT * FROM table1 where param={}".format(param)

sqlContext.sql(query)

I tried the same with an insert query, like the one below:

val a = 1207
val b = "amit"
val c = 20

sqlContext.sql(s"""insert into table employee select t.* from (select $a, $b, $c) t""")

But, I got the following error

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'amit' given input columns ;
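
The error comes from how $b is interpolated: since b holds the string "amit", the generated SQL reads select 1207, amit, 20, and Spark parses the unquoted amit as a column reference it cannot resolve. A minimal sketch of a fix (assuming the same employee table) is to wrap the interpolated string value in single quotes so it arrives as a SQL string literal:

val a = 1207
val b = "amit"
val c = 20

// Quote $b so the generated SQL contains the string literal 'amit'
// instead of an unresolved column reference.
sqlContext.sql(s"""insert into table employee select t.* from (select $a, '$b', $c) t""")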
