Improving SQLAlchemy insert performance for a large set of data

# using add(): one ORM object per row, added to the session in a loop
for d in data:
    session.add(tbModel(id=d))
session.commit()

# using bulk_save_objects
session.bulk_save_objects([tbModel(id=d) for d in data])
session.commit()


# using bulk_insert_mappings
session.bulk_insert_mappings(tbModel, [{"id": d} for d in data])
session.commit()

# using core: a single executemany-style INSERT, bypassing the ORM unit of work
session.execute(tbModel.__table__.insert(), [{"id": d} for d in data])
session.commit()
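As a side note, in SQLAlchemy 1.4+/2.0 the same Core-style bulk insert can be written with the `insert()` construct: passing a list of dicts as the second argument to `session.execute()` performs an executemany-style insert. A minimal self-contained sketch (the model and the in-memory SQLite engine here are stand-ins for the real `tbModel` and database):

```python
from sqlalchemy import Column, Integer, create_engine, insert
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class tbModel(Base):
    __tablename__ = "tb"
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")  # in-memory stand-in for the real DB
Base.metadata.create_all(engine)

data = range(1000)
with Session(engine) as session:
    # a list of dicts triggers an executemany-style bulk INSERT
    session.execute(insert(tbModel), [{"id": d} for d in data])
    session.commit()
```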

Ranked by speed, the four approaches come out as:

core > bulk_insert_mappings > bulk_save_objects >>>>> add()

add() is almost 10x slower than the other three.

In short: avoid add() when inserting a large amount of data.
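To reproduce the comparison yourself, the four snippets above can be wrapped in a small benchmark harness. This is a minimal sketch, assuming SQLAlchemy 1.4+ and a throwaway in-memory SQLite database (the `tbModel` defined here is a placeholder with a single `id` column; real tables, drivers, and row counts will shift the absolute numbers, though the ranking should hold):

```python
import time
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class tbModel(Base):
    __tablename__ = "tb"
    id = Column(Integer, primary_key=True)

N = 20_000  # rows per run; raise this to make the gap more pronounced

def timed(label, insert_fn):
    """Run insert_fn against a fresh in-memory DB and report elapsed time."""
    engine = create_engine("sqlite://")
    Base.metadata.create_all(engine)
    with Session(engine) as session:
        start = time.perf_counter()
        insert_fn(session, range(N))
        session.commit()
        elapsed = time.perf_counter() - start
        count = session.query(tbModel).count()
    print(f"{label}: {elapsed:.2f}s ({count} rows)")
    return count

def via_add(session, data):
    for d in data:
        session.add(tbModel(id=d))

def via_bulk_save(session, data):
    session.bulk_save_objects([tbModel(id=d) for d in data])

def via_bulk_mappings(session, data):
    session.bulk_insert_mappings(tbModel, [{"id": d} for d in data])

def via_core(session, data):
    session.execute(tbModel.__table__.insert(), [{"id": d} for d in data])

for label, fn in [("add()", via_add),
                  ("bulk_save_objects", via_bulk_save),
                  ("bulk_insert_mappings", via_bulk_mappings),
                  ("core", via_core)]:
    timed(label, fn)
```

Each run gets its own engine so earlier inserts cannot skew later timings. `add()` is slowest because the unit of work tracks every object individually, while the Core path sends one executemany statement.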