pyspark.sql.functions.element_at(col, extraction)
Collection function: returns the element of the array at the given index in extraction if col is an array, or the value for the given key in extraction if col is a map.
New in version 2.4.0.
Parameters
col : Column
    name of column containing array or map
extraction
    index to check for in array or key to check for in map
Notes
The position is not zero-based, but 1-based.
Examples
>>> from pyspark.sql.functions import element_at
>>> df = spark.createDataFrame([(["a", "b", "c"],), ([],)], ['data'])
>>> df.select(element_at(df.data, 1)).collect()
[Row(element_at(data, 1)='a'), Row(element_at(data, 1)=None)]
>>> from pyspark.sql.functions import element_at, lit
>>> df = spark.createDataFrame([({"a": 1.0, "b": 2.0},), ({},)], ['data'])
>>> df.select(element_at(df.data, lit("a"))).collect()
[Row(element_at(data, a)=1.0), Row(element_at(data, a)=None)]
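The 1-based indexing noted above can be illustrated with a plain-Python analogue. This is a sketch of the observable semantics only, not Spark's implementation; the negative-index behavior (counting from the end of the array) is an assumption about the underlying SQL function, and the function name element_at_like is hypothetical.

```python
def element_at_like(data, extraction):
    """Illustrative analogue of element_at's semantics (not Spark code)."""
    if isinstance(data, dict):
        # Map: return the value for the key, or None when the key is absent.
        return data.get(extraction)
    # Array: indices are 1-based; 0 is invalid.
    if extraction == 0:
        raise ValueError("SQL array indices start at 1")
    # Assumed behavior: negative indices count from the end of the array.
    index = extraction - 1 if extraction > 0 else extraction
    try:
        return data[index]
    except IndexError:
        # Out-of-range index yields None, matching the doctests above.
        return None

element_at_like(["a", "b", "c"], 1)         # returns 'a'
element_at_like([], 1)                      # returns None
element_at_like({"a": 1.0, "b": 2.0}, "a")  # returns 1.0
```

The empty-array and empty-map cases return None rather than raising, which mirrors the None results shown in the examples above.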