Uploads data to an Amazon Simple Storage Service (S3) bucket.
This transformer accepts any feature.
Features which have successfully uploaded their contents to S3 are output through this port.
Features which have not successfully uploaded their contents to S3 are output through this port. Additionally, a message stored in the attribute specified in Error Attribute will contain details about the failure.
The name of the Amazon S3 bucket.
The AWS Region in which the specified Bucket resides. If the default value, US East (N. Virginia) (us-east-1), is specified but the Bucket resides elsewhere, the operation will still succeed. However, to minimize latency, it is best practice to specify the correct region.
Specify a web connection to Amazon S3. Web connections can be reused in multiple workspaces, and connection parameters are hidden in the workspace. For more information, see Using Web Connections. Alternatively, check Embed Access Key (below) and specify the connection parameters manually.
When checked, you must specify the connection parameters to Amazon S3 manually. The connection parameters are visible in the workspace. To connect, specify:
An access key associated with a user with permission to upload data to the specified bucket. If an access key and secret access key pair is not given, credentials will be searched for in the locations listed here.
If credentials are still not found, the client will act as if it is in anonymous mode, where requests aren’t signed. This is useful if accessing a publicly accessible object or bucket.
A secret key paired with the access key provided. See the Access Key ID parameter for credential searching.
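The credential search order described above can be pictured with a short sketch. This is illustrative only: the function name is hypothetical, and the real AWS SDK checks additional locations (shared credential files, instance profiles) beyond the two shown here.

```python
import os

def resolve_credentials(access_key=None, secret_key=None):
    """Illustrative credential resolution: explicit keys first, then
    standard AWS environment variables, then anonymous mode."""
    # 1. An explicit access key / secret access key pair supplied
    #    in the transformer parameters takes precedence.
    if access_key and secret_key:
        return {"mode": "signed", "access_key": access_key}
    # 2. Fall back to the standard AWS environment variables.
    env_key = os.environ.get("AWS_ACCESS_KEY_ID")
    env_secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if env_key and env_secret:
        return {"mode": "signed", "access_key": env_key}
    # 3. No credentials found: requests are not signed (anonymous
    #    mode), which works only for publicly accessible objects.
    return {"mode": "anonymous", "access_key": None}
```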
The name that the uploaded data will be stored under. Bucket Name and Object Key together uniquely identify an object, and should vary for each feature processed.
If Yes, Amazon S3 Transfer Acceleration is used for the upload, provided it is enabled on the bucket. To enable acceleration, see http://docs.aws.amazon.com/AmazonS3/latest/UG/enable-bucket-transfer-acceleration.html.
When allowing S3 Acceleration, keep in mind the following:
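At the request level, acceleration changes which endpoint the upload targets: the bucket's global edge-optimized endpoint rather than its regional one. The helper below is an illustrative sketch of that difference; the hostname formats follow AWS's documented virtual-hosted style.

```python
def upload_endpoint(bucket, region, accelerate=False):
    """Return the virtual-hosted-style endpoint an upload would target.

    With Transfer Acceleration, requests go to the global
    s3-accelerate endpoint; otherwise, to the regional endpoint."""
    if accelerate:
        return f"{bucket}.s3-accelerate.amazonaws.com"
    return f"{bucket}.s3.{region}.amazonaws.com"
```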
The source of the data to be uploaded.
When working with large objects, File is the better choice: the data is streamed directly from disk, rather than requiring the entire object to be stored in memory on a feature.
The file to be uploaded, when Data Source is File.
The data to be uploaded, when Data Source is Attribute or Expression. Either an attribute value or an expression can supply the data.
The predefined set of grantees and permissions to store with each uploaded object. For more information, see http://docs.aws.amazon.com/AmazonS3/latest/dev/ACLOverview.html#CannedACL.
You can specify header fields in the HTTP upload request. Under Name, specify a header field. You can select from a drop-down list of predefined fields that include metadata and ACL permissions parameters. Alternatively, enter fields manually. Under Value, specify a field value.
Note: If CannedACL or Content-Type headers are specified here, they override the Permissions and Upload Content Type settings, respectively.
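The override behavior above can be pictured as a simple merge in which explicitly entered headers win. This is a hypothetical sketch, not the transformer's actual implementation, and it assumes the canned ACL maps to the x-amz-acl request header.

```python
def effective_headers(permissions, content_type, custom_headers):
    """Build the final HTTP headers for an upload request.

    Headers entered in the Headers table override the Permissions
    (canned ACL) and Upload Content Type settings."""
    headers = {
        "x-amz-acl": permissions,       # from the Permissions parameter
        "Content-Type": content_type,   # from Upload Content Type
    }
    headers.update(custom_headers)      # explicit headers take precedence
    return headers
```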
The folder to be uploaded, when Data Source is Folder.
Specifies whether subfolders of the given folder are included in the upload.
Specify the output attribute that will store the uploaded object’s URI.
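Since Bucket Name and Object Key together uniquely identify an object, the stored URI can be sketched as a simple composition of the two (illustrative only, using the s3:// scheme).

```python
def object_uri(bucket, key):
    """Compose the URI identifying one uploaded object.

    Bucket Name and Object Key together are the unique identifier,
    so each feature should produce a distinct (bucket, key) pair."""
    return f"s3://{bucket}/{key}"
```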
Specify a list name for the list attribute that will contain all headers entered in Headers.
Using a set of menu options, transformer parameters can be assigned by referencing other elements in the workspace. More advanced functions, such as an advanced editor and an arithmetic editor, are also available in some transformers. To access a menu of these options, click beside the applicable parameter. For more information, see Transformer Parameter Menu Options.
FME Professional edition and above
Associated FME function or factory: S3Factory
Search for samples and information about this transformer on the FME Knowledge Center.