Where possible, move constraints from Transformer stages into input stage WHERE clauses, so the database filters rows before the job has to process them.
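As an illustration (the table and column names here are hypothetical), pushing a constraint out of a Transformer and into the input stage's user-defined SQL might look like:

```sql
-- Before: the input stage reads every row, and a Transformer
-- constraint discards the ones that are not needed.
SELECT ORDER_ID, CUSTOMER_ID, STATUS, ORDER_DATE
FROM   ORDERS;

-- After: the same constraint expressed in the input stage's
-- SQL, so the database filters the rows instead.
SELECT ORDER_ID, CUSTOMER_ID, STATUS, ORDER_DATE
FROM   ORDERS
WHERE  STATUS = 'OPEN';
```

The fewer rows that cross the stage boundary, the less work the Transformer and any downstream stages have to do.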
Adjust the rows per transaction setting, which controls how often the output stage commits. Try 1,000, 5,000, or 10,000.
Adjust the array size setting, which controls how many rows are sent to or fetched from the database per call. Try 10, 100, or 1,000.
If output rows are INSERTs or APPENDs, not UPDATEs, consider using a native bulk loader. Direct output to a sequential file compatible with the bulk loader, then invoke the bulk loader from an after-job subroutine. Oracle's bulk loader is SQL*Loader, invoked as sqlldr.
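As a sketch, assuming the job writes a comma-delimited sequential file /tmp/orders.dat for a hypothetical ORDERS table, a minimal SQL*Loader control file might look like:

```
-- orders.ctl: minimal SQL*Loader control file
-- (file path, table, and column names are hypothetical)
LOAD DATA
INFILE '/tmp/orders.dat'
APPEND
INTO TABLE ORDERS
FIELDS TERMINATED BY ','
(ORDER_ID, CUSTOMER_ID, STATUS, ORDER_DATE DATE "YYYY-MM-DD")
```

The after-job subroutine (for example, ExecSH) would then run something like `sqlldr userid=... control=orders.ctl log=orders.log`, with credentials supplied per your site's standards; adding `direct=true` requests a direct-path load, which is typically faster still.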
Consider moving reference lookups to a join within the input stage. All columns used to join the tables should be indexed to maximize performance.
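For example, instead of performing a per-row reference lookup against a customer table, the lookup could be folded into the input stage's user-defined SQL as a join (table and column names here are hypothetical):

```sql
-- The database resolves the lookup once, set-wise, rather
-- than the job probing a reference table row by row.
SELECT o.ORDER_ID, o.ORDER_DATE, c.CUSTOMER_NAME, c.REGION
FROM   ORDERS o
JOIN   CUSTOMERS c
  ON   c.CUSTOMER_ID = o.CUSTOMER_ID;
```

Here CUSTOMER_ID would be the column to index on both tables.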
If the number of rows in a hashed file is small, consider caching it by enabling the pre-load into memory option in the Hashed File stage.