Amazon Simple Storage Service
Developer Guide (API Version 2006-03-01)
The AWS services or features described in AWS documentation may vary by Region/location. See Getting Started with Amazon AWS to view the specific differences for the China Regions.

Uploading Files

The following tasks guide you through using the high-level .NET classes to upload a file. The API provides several variations (overloads) of the Upload method so that you can easily upload your data.

High-Level API File Uploading Process

1. Create an instance of the TransferUtility class by providing your AWS credentials.

2. Execute one of the TransferUtility.Upload overloads depending on whether you are uploading data from a file, a stream, or a directory.

The following C# code example demonstrates the preceding tasks.

TransferUtility utility = new TransferUtility();
utility.Upload(filePath, existingBucketName);

When uploading large files using the .NET API, a timeout might occur even while data is being written to the request stream. You can set an explicit timeout using TransferUtilityConfig.DefaultTimeout, as demonstrated in the following C# code example.

TransferUtilityConfig config = new TransferUtilityConfig();
config.DefaultTimeout = 11111;
TransferUtility utility = new TransferUtility(config);

The following C# code example uploads a file to an Amazon S3 bucket. The example illustrates the use of various TransferUtility.Upload overloads to upload a file; each successive call to upload replaces the previous upload. For instructions on how to create and test a working sample, see Running the Amazon S3 .NET Code Examples.

using System;
using System.IO;
using Amazon.S3;
using Amazon.S3.Transfer;

namespace s3.amazon.com.docsamples
{
    class UploadFileMPUHighLevelAPI
    {
        static string existingBucketName = "*** Provide bucket name ***";
        static string keyName            = "*** Provide your object key ***";
        static string filePath           = "*** Provide file name ***";

        static void Main(string[] args)
        {
            try
            {
                TransferUtility fileTransferUtility = new TransferUtility(
                    new AmazonS3Client(Amazon.RegionEndpoint.USEast1));

                // 1. Upload a file. The file name is used as the object key name.
                fileTransferUtility.Upload(filePath, existingBucketName);
                Console.WriteLine("Upload 1 completed");

                // 2. Specify the object key name explicitly.
                fileTransferUtility.Upload(filePath, existingBucketName, keyName);
                Console.WriteLine("Upload 2 completed");

                // 3. Upload data from a type of System.IO.Stream.
                using (FileStream fileToUpload =
                    new FileStream(filePath, FileMode.Open, FileAccess.Read))
                {
                    fileTransferUtility.Upload(fileToUpload, existingBucketName, keyName);
                }
                Console.WriteLine("Upload 3 completed");

                // 4. Specify advanced settings/options.
                TransferUtilityUploadRequest fileTransferUtilityRequest =
                    new TransferUtilityUploadRequest
                    {
                        BucketName = existingBucketName,
                        FilePath = filePath,
                        StorageClass = S3StorageClass.ReducedRedundancy,
                        PartSize = 6291456, // 6 MB.
                        Key = keyName,
                        CannedACL = S3CannedACL.PublicRead
                    };
                fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
                fileTransferUtilityRequest.Metadata.Add("param2", "Value2");
                fileTransferUtility.Upload(fileTransferUtilityRequest);
                Console.WriteLine("Upload 4 completed");
            }
            catch (AmazonS3Exception s3Exception)
            {
                Console.WriteLine(s3Exception.Message, s3Exception.InnerException);
            }
        }
    }
}
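For large uploads it is often useful to report progress to the user. The TransferUtilityUploadRequest type exposes an UploadProgressEvent that the SDK raises as data is transferred. The sketch below shows one way a caller might monitor an upload; the bucket, key, and file names are placeholders, and the event signature is assumed to match the same SDK version used in the example above.

```csharp
using System;
using Amazon.S3;
using Amazon.S3.Transfer;

class UploadWithProgress
{
    static void Main()
    {
        // Placeholder names for illustration only.
        string bucketName = "*** Provide bucket name ***";
        string keyName    = "*** Provide object key ***";
        string filePath   = "*** Provide file name ***";

        TransferUtility transferUtility = new TransferUtility(
            new AmazonS3Client(Amazon.RegionEndpoint.USEast1));

        TransferUtilityUploadRequest request = new TransferUtilityUploadRequest
        {
            BucketName = bucketName,
            Key = keyName,
            FilePath = filePath
        };

        // Raised periodically as parts of the file are sent.
        request.UploadProgressEvent += (sender, e) =>
            Console.WriteLine("{0} of {1} bytes transferred ({2}%)",
                e.TransferredBytes, e.TotalBytes, e.PercentDone);

        transferUtility.Upload(request);
    }
}
```

Because the handler runs on the transfer's worker thread, a UI application would typically marshal the update back to its main thread rather than writing to the console directly.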