As mentioned in the question, the data is an Array[org.apache.spark.sql.Row] with only one element in each Row, so the simplest solution would be
scala> arr.map(x => x(0))
//res1: Array[Any] = Array(conversionevents, elements, pageviews, productviews)
I would convert each value to String so you get a typed Array[String] instead of Array[Any], and so the square brackets from Row's toString never show up:
scala> arr.map(x => x(0).toString)
//res2: Array[String] = Array(conversionevents, elements, pageviews, productviews)
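The extraction step can be sketched without a Spark session: a Row is indexed with apply(i) just like a Seq, so plain Seq[Any] values stand in for Rows here (the object name and sample data are illustrative, not from the original).

```scala
// Stand-in sketch: Seq[Any] plays the role of org.apache.spark.sql.Row,
// since both expose apply(i) to fetch a value by position.
object FirstElementDemo extends App {
  val arr: Array[Seq[Any]] = Array(
    Seq("conversionevents"), Seq("elements"), Seq("pageviews"), Seq("productviews"))

  // Take the first value of each "row" and stringify it
  val strings: Array[String] = arr.map(row => row(0).toString)

  println(strings.mkString(", "))
  // conversionevents, elements, pageviews, productviews
}
```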
But if some Rows contain more than one element, as in
//arr: Array[org.apache.spark.sql.Row] = Array([conversionevents,test1], [elements], [pageviews,test21,test22], [productviews])
the solution above would silently drop all values after the first:
val result = arr.map(x => x(0))
//result: Array[Any] = Array(conversionevents, elements, pageviews, productviews)
The general solution is to use flatMap with toSeq, which keeps every value from every Row:
val result = arr.flatMap(x => x.toSeq)
//result: Array[Any] = Array(conversionevents, test1, elements, pageviews, test21, test22, productviews)
And of course, if you want them as Strings, you can map toString over each value:
val result = arr.flatMap(x => x.toSeq.map(_.toString))
//result: Array[String] = Array(conversionevents, test1, elements, pageviews, test21, test22, productviews)
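The flatten step can likewise be sketched with plain collections: Row.toSeq returns a Seq of the row's values, so Seq[Any] models a Row directly (object name and sample data are illustrative).

```scala
// Stand-in sketch: each Seq[Any] models what Row.toSeq would return,
// so no Spark session is needed to see how flatMap concatenates the rows.
object FlattenRowsDemo extends App {
  val arr: Array[Seq[Any]] = Array(
    Seq("conversionevents", "test1"),
    Seq("elements"),
    Seq("pageviews", "test21", "test22"),
    Seq("productviews"))

  // flatMap splices every row's values into one flat Array[String]
  val result: Array[String] = arr.flatMap(row => row.map(_.toString))

  println(result.mkString(", "))
  // conversionevents, test1, elements, pageviews, test21, test22, productviews
}
```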
I hope the answer is helpful.
A note on Row.toString: it includes the square brackets even for a single-element Row, so extracting the elements (as above) is the way to get the plain strings out. For rows with multiple entries, see the documentation of Row; the top of that page also has an example of using the Row interface from Scala.