The solution proposed above by Sai Kiriti Badam worked for me.
I'm using Azure Databricks to read data captured from an Event Hub. The captured data contains a string column named EnqueuedTimeUtc with values in the following format...
12/7/2018 12:54:13 PM
I'm working in a Python notebook and used the following...
import pyspark.sql.functions as func

# Parse the EnqueuedTimeUtc string (e.g. "12/7/2018 12:54:13 PM") into a timestamp column
sports_messages = sports_df.withColumn("EnqueuedTimestamp", func.to_timestamp("EnqueuedTimeUtc", "MM/dd/yyyy hh:mm:ss aaa"))
... which creates a new column EnqueuedTimestamp of type "timestamp" with values in the following format...
2018-12-07 12:54:13
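
If you want to sanity-check the pattern without wiring up an Event Hub, here is a minimal, self-contained sketch; the SparkSession setup and the one-row sample DataFrame are mine and not part of the original pipeline. One caveat: newer runtimes on Spark 3.x use a stricter datetime parser that may reject single-digit days with this pattern, in which case a lenient pattern like "M/d/yyyy h:mm:ss a" (or setting spark.sql.legacy.timeParserPolicy to LEGACY) may be needed.

import pyspark.sql.functions as func
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# One-row stand-in for the Event Hub capture, just to check the parsing pattern
sample_df = spark.createDataFrame([("12/7/2018 12:54:13 PM",)], ["EnqueuedTimeUtc"])

parsed_df = sample_df.withColumn(
    "EnqueuedTimestamp",
    func.to_timestamp("EnqueuedTimeUtc", "MM/dd/yyyy hh:mm:ss aaa")
)

# If the Spark 3.x parser complains, try the pattern "M/d/yyyy h:mm:ss a"
# or spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

parsed_df.printSchema()          # EnqueuedTimestamp should show as timestamp
parsed_df.show(truncate=False)   # expect 2018-12-07 12:54:13

In a Databricks notebook the spark session already exists, so getOrCreate simply picks it up rather than creating a new one.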