Question: How to upload a file to a directory in an S3 bucket using boto

I want to copy a file to an S3 bucket using Python.

Example: I have a bucket named test, and in the bucket I have two folders named "dump" and "input". Now I want to copy a file from a local directory to the S3 "dump" folder using Python… Can anyone help me?


Answer 0

Try this:

import boto
import boto.s3
import boto.s3.connection
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''

bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
conn = boto.connect_s3(AWS_ACCESS_KEY_ID,
        AWS_SECRET_ACCESS_KEY)

bucket = conn.create_bucket(bucket_name,
    location=boto.s3.connection.Location.DEFAULT)

testfile = "replace this with an actual filename"
print('Uploading %s to Amazon S3 bucket %s' % (testfile, bucket_name))

# progress callback: prints a dot for each chunk reported
def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

k = Key(bucket)
k.key = 'my test file'
k.set_contents_from_filename(testfile,
    cb=percent_cb, num_cb=10)

[UPDATE] I am not a pythonist, so thanks for the heads-up about the import statements. Also, I would not recommend placing credentials inside your own source code. If you are running this inside AWS, use IAM credentials with instance profiles (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), and to keep the same behaviour in your dev/test environment, use something like Hologram from AdRoll (https://github.com/AdRoll/hologram).
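
For illustration, a minimal sketch of what that looks like with boto itself: when an instance profile, the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables, or a ~/.boto config file is available, the connect call needs no keys in source at all.

import boto

# With no arguments, boto resolves credentials from its provider chain
# (environment variables, config files, or an instance profile).
conn = boto.connect_s3()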


Answer 1

No need to make it that complicated:

import boto
import boto.s3.key

s3_connection = boto.connect_s3()
bucket = s3_connection.get_bucket('your bucket name')
key = boto.s3.key.Key(bucket, 'some_file.zip')
with open('some_file.zip', 'rb') as f:  # binary mode, so the bytes go up unmodified
    key.send_file(f)
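
If you would rather not manage the file handle yourself, boto's Key also offers set_contents_from_filename, which takes a local path directly:

key = boto.s3.key.Key(bucket, 'some_file.zip')
key.set_contents_from_filename('some_file.zip')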

Answer 2

import boto3

s3 = boto3.resource('s3')
BUCKET = "test"

s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file")
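
upload_file also accepts an optional ExtraArgs dict for per-object settings. A minimal sketch (the content type here is only an example):

import boto3

s3 = boto3.resource('s3')
s3.Bucket("test").upload_file(
    "your/local/file", "dump/file",
    ExtraArgs={'ContentType': 'text/plain'})  # optional per-object metadata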

Answer 3

I used this and it is very simple to implement.

import tinys3

conn = tinys3.Connection('S3_ACCESS_KEY', 'S3_SECRET_KEY', tls=True)

with open('some_file.zip', 'rb') as f:  # close the handle once the upload finishes
    conn.upload('some_file.zip', f, 'my_bucket')

https://www.smore.com/labs/tinys3/


Answer 4

from boto3.s3.transfer import S3Transfer
import boto3

# populate access_key, secret_key, filepath, bucket_name,
# folder_name and filename before running this
client = boto3.client('s3', aws_access_key_id=access_key,
                      aws_secret_access_key=secret_key)
transfer = S3Transfer(client)
transfer.upload_file(filepath, bucket_name, folder_name + "/" + filename)
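
S3Transfer can also take a TransferConfig, which controls when multipart uploads kick in. A sketch with an 8 MB threshold (the value is arbitrary, for illustration only):

from boto3.s3.transfer import S3Transfer, TransferConfig
import boto3

config = TransferConfig(multipart_threshold=8 * 1024 * 1024)  # multipart above 8 MB
client = boto3.client('s3')  # credentials resolved from the environment here
transfer = S3Transfer(client, config)
transfer.upload_file(filepath, bucket_name, folder_name + "/" + filename)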

Answer 5

Upload the file to S3 within a session that holds the credentials.

import boto3

session = boto3.Session(
    aws_access_key_id='AWS_ACCESS_KEY_ID',
    aws_secret_access_key='AWS_SECRET_ACCESS_KEY',
)
s3 = session.resource('s3')
# Filename - File to upload
# Bucket - Bucket to upload to (the top level directory under AWS S3)
# Key - S3 object name (can contain subdirectories). If not specified then file_name is used
s3.meta.client.upload_file(Filename='input_file_path', Bucket='bucket_name', Key='s3_output_key')
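
If the keys live in a named profile in ~/.aws/credentials instead of in code, the same session can be built without literals (the profile name 'dev' is just an example):

import boto3

session = boto3.Session(profile_name='dev')  # keys read from ~/.aws/credentials
s3 = session.resource('s3')
s3.meta.client.upload_file(Filename='input_file_path', Bucket='bucket_name', Key='s3_output_key')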

Answer 6

This will also work:

import os
import boto
import boto.s3.connection
from boto.s3.key import Key

try:
    conn = boto.s3.connect_to_region(
        'us-east-1',
        aws_access_key_id='AWS-Access-Key',
        aws_secret_access_key='AWS-Secret-Key',
        # host='s3-website-us-east-1.amazonaws.com',
        # is_secure=False,             # uncomment if you are not using ssl
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
    )

    bucket = conn.get_bucket('YourBucketName')
    key_name = 'FileToUpload'
    path = 'images/holiday'  # directory under which the file should be uploaded
    full_key_name = os.path.join(path, key_name)
    k = bucket.new_key(full_key_name)
    k.set_contents_from_filename(key_name)

except Exception as e:
    print(str(e))
    print("error")

Answer 7

This is a three-liner. Just follow the instructions in the boto3 documentation.

import boto3
s3 = boto3.resource(service_name='s3')
s3.meta.client.upload_file(Filename='C:/foo/bar/baz.filetype', Bucket='yourbucketname', Key='baz.filetype')

Some important arguments are:

Parameters:

  • Filename (str) — The path to the file to upload.
  • Bucket (str) — The name of the bucket to upload to.
  • Key (str) — The name of the key that you want to assign to your file in your S3 bucket. This could be the same as the name of the file or a different name of your choice, but the filetype should remain the same.

Note: I assume that you have saved your credentials in a ~/.aws folder, as suggested in the configuration best practices in the boto3 documentation.
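
For reference, that credentials file is ~/.aws/credentials, a plain INI file along these lines (the values are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY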


Answer 8

import boto
import boto.s3
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
END_POINT = ''                          # eg. us-east-1
S3_HOST = ''                            # eg. s3.us-east-1.amazonaws.com
BUCKET_NAME = 'test'
FILENAME = 'upload.txt'
UPLOADED_FILENAME = 'dumps/upload.txt'
# include folders in the key; if they don't exist, they will be created

s3 = boto.s3.connect_to_region(END_POINT,
                           aws_access_key_id=AWS_ACCESS_KEY_ID,
                           aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                           host=S3_HOST)

bucket = s3.get_bucket(BUCKET_NAME)
k = Key(bucket)
k.key = UPLOADED_FILENAME
k.set_contents_from_filename(FILENAME)

Answer 9

Using boto3:

import logging
import boto3
from botocore.exceptions import ClientError


def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """

    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True

For more: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
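
A quick usage sketch of that helper with the names from the question (bucket test, folder dump):

if upload_file('some_file.txt', 'test', 'dump/some_file.txt'):
    print('uploaded')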


Answer 10

An example of uploading a whole folder follows (the original post included a screenshot of the resulting S3 folder layout, omitted here).

import boto
import boto.s3
import boto.s3.connection
import boto.s3.key
import os.path
import sys

# Fill in info on data to upload
# destination bucket name
bucket_name = 'willie20181121'
# source directory
sourceDir = '/home/willie/Desktop/x/'  # local path
# destination directory name (on S3)
destDir = '/test1/'  # S3 path

# max size in bytes before uploading in parts (between 1 and 5 GB recommended)
MAX_SIZE = 20 * 1000 * 1000
# size of parts when uploading in parts (S3 requires at least 5 MB per part)
PART_SIZE = 6 * 1000 * 1000

access_key = 'MPBVAQ*******IT****'
secret_key = '11t63yDV***********HgUcgMOSN*****'

conn = boto.connect_s3(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        host='******.org.tw',
        is_secure=False,               # False because this endpoint is not using ssl
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
        )
bucket = conn.create_bucket(bucket_name,
        location=boto.s3.connection.Location.DEFAULT)

# collect the file names in the top level of the source directory
uploadFileNames = []
for (dirpath, dirnames, filenames) in os.walk(sourceDir):
    uploadFileNames.extend(filenames)
    break

# progress callback: prints a dot for each chunk reported
def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

for filename in uploadFileNames:
    sourcepath = os.path.join(sourceDir, filename)
    destpath = os.path.join(destDir, filename)
    print('Uploading %s to Amazon S3 bucket %s' % (sourcepath, bucket_name))

    filesize = os.path.getsize(sourcepath)
    if filesize > MAX_SIZE:
        print("multipart upload")
        mp = bucket.initiate_multipart_upload(destpath)
        fp = open(sourcepath, 'rb')
        fp_num = 0
        while fp.tell() < filesize:
            fp_num += 1
            print("uploading part %i" % fp_num)
            mp.upload_part_from_file(fp, fp_num, cb=percent_cb, num_cb=10, size=PART_SIZE)
        fp.close()
        mp.complete_upload()
    else:
        print("singlepart upload")
        k = boto.s3.key.Key(bucket)
        k.key = destpath
        k.set_contents_from_filename(sourcepath,
                cb=percent_cb, num_cb=10)

PS: For more, see the reference URL.
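
For comparison, a minimal boto3 sketch of the same single-level folder upload. boto3's upload_file switches to multipart automatically above its internal threshold, so no explicit part handling is needed (bucket name and paths reuse the values above):

import os
import boto3

s3 = boto3.client('s3')
source_dir = '/home/willie/Desktop/x/'
for name in os.listdir(source_dir):
    local_path = os.path.join(source_dir, name)
    if os.path.isfile(local_path):
        s3.upload_file(local_path, 'willie20181121', 'test1/' + name)  # key mirrors destDir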


Answer 11

import boto
import boto.s3.connection
from xml.etree import ElementTree as etree

# 'listings' is an XML element built elsewhere; access_key / secret_key
# are likewise assumed to be defined earlier in the script
xmlstr = etree.tostring(listings, encoding='utf8', method='xml')
conn = boto.connect_s3(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        host='<bucketName>.s3.amazonaws.com',  # replace <bucketName> with your bucket
        # is_secure=False,             # uncomment if you are not using ssl
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
        )
conn.auth_region_name = 'us-west-1'

bucket = conn.get_bucket('resources', validate=False)
key = bucket.new_key('filename.txt')  # new_key works whether or not the key exists yet
key.set_contents_from_string("SAMPLE TEXT")
key.set_canned_acl('public-read')
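
A rough boto3 equivalent of those last three calls, for reference (put_object uploads bytes and can set the canned ACL in one request):

import boto3

s3 = boto3.client('s3')
s3.put_object(Bucket='resources', Key='filename.txt',
              Body=b'SAMPLE TEXT', ACL='public-read')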
    

Answer 12

Here is something that seems to me a bit more organized:

import boto3
from pprint import pprint
from botocore.exceptions import NoCredentialsError


class S3(object):
    BUCKET = "test"
    connection = None

    def __init__(self):
        try:
            # get_s3_credentials is a project helper assumed to return a dict
            # holding 'aws_access_key_id' and 'aws_secret_access_key'
            creds = get_s3_credentials("aws")
            self.connection = boto3.resource(
                's3',
                aws_access_key_id=creds['aws_access_key_id'],
                aws_secret_access_key=creds['aws_secret_access_key'])
        except Exception as error:
            print(error)
            self.connection = None

    def upload_file(self, file_to_upload_path, file_name):
        if file_to_upload_path is None or file_name is None:
            return False
        try:
            pprint(file_to_upload_path)
            file_name = "your-folder-inside-s3/{0}".format(file_name)
            self.connection.Bucket(self.BUCKET).upload_file(file_to_upload_path,
                                                            file_name)
            print("Upload Successful")
            return True

        except FileNotFoundError:
            print("The file was not found")
            return False

        except NoCredentialsError:
            print("Credentials not available")
            return False

    There’re three important variables here, the BUCKET const, the file_to_upload and the file_name

BUCKET: the name of your S3 bucket.

file_to_upload_path: the path of the file you want to upload.

file_name: the resulting file name and path in your bucket (this is where you add folders or whatever).

There are many ways to do it, but you can reuse this code from another script like this:

import S3

def some_function():
    S3.S3().upload_file(path_to_file, final_file_name)
    
