You can use the pandas library if the file is not very large. Be careful when iterating over pandas DataFrames with iterrows(): it is slow, and I am only doing it here to demonstrate the possibility. You could also consider the pd.DataFrame.to_sql() function for copying a DataFrame into an SQL table (a sketch is at the end of this answer).
Assuming you have already created the table you want to load into (if not, a minimal hypothetical sketch follows; the column names and types are assumptions based on the code below):
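CREATE TABLE my_table (
    col1 text,
    col3 text,
    col6 text
);
With the table in place, you could: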
import psycopg2
import pandas as pd

# load the csv; adjust the delimiter to match your file
data = pd.read_csv(r'path\to\file.csv', delimiter=' ')

# prepare your data and keep only the relevant columns
data.drop(['col2', 'col4', 'col5'], axis=1, inplace=True)
data.dropna(inplace=True)
print(data.iloc[:3])  # quick sanity check of the first three rows

conn = psycopg2.connect("dbname=db user=postgres password=password")
cur = conn.cursor()

# insert row by row; fine for small files, slow for large ones
for index, row in data.iterrows():
    cur.execute('''INSERT INTO my_table (col1, col3, col6)
                   VALUES (%s, %s, %s)''',
                (row['col1'], row['col3'], row['col6']))

conn.commit()
cur.close()
conn.close()
print('\n db connection closed.')
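If you prefer the pd.DataFrame.to_sql() route mentioned above, here is a minimal sketch. It assumes SQLAlchemy is installed; the connection URL and the table name my_table are placeholders you will need to adapt:

from sqlalchemy import create_engine

# hypothetical connection URL; adjust user, password, host, and dbname
engine = create_engine('postgresql+psycopg2://postgres:password@localhost/db')

# append the prepared DataFrame to the existing table without writing the index
data.to_sql('my_table', engine, if_exists='append', index=False)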
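If the file is larger and the row-by-row loop becomes too slow, psycopg2's execute_values() can batch the inserts. A sketch that would replace the iterrows() loop above, assuming the same conn, cur, data, and hypothetical table name:

from psycopg2.extras import execute_values

# turn the relevant columns into a list of plain tuples
rows = list(data[['col1', 'col3', 'col6']].itertuples(index=False, name=None))

# one round trip per batch instead of one per row
execute_values(cur, 'INSERT INTO my_table (col1, col3, col6) VALUES %s', rows)
conn.commit()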