Python: copying / removing images

This code example shows how to randomly select 494 images from the training-set folders (trainA/B/C), copy them into the test-set folders (testA/B/C), and delete the corresponding images from the training set.


import numpy as np
import shutil
import os

root = '/root/……/datasets/face_newData'
trainA_path = root + '/trainA'
trainB_path = root + '/trainB'
trainC_path = root + '/trainC'

testA_path = root + '/testA'
testB_path = root + '/testB'
testC_path = root + '/testC'

# Draw 494 distinct indices up front. The original loop called
# np.random.randint 494 times and skipped duplicates, so it could
# end up moving fewer than 494 images.
ID = np.random.choice(10494, size=494, replace=False)
for idx in ID:
    print(idx)

    A_path = trainA_path + '/{}_A.jpg'.format(idx)
    A_new_path = testA_path + '/{}_A.jpg'.format(idx)
    shutil.copy(A_path, A_new_path)
    os.remove(A_path)

    B_path = trainB_path + '/{}_B.jpg'.format(idx)
    B_new_path = testB_path + '/{}_B.jpg'.format(idx)
    shutil.copy(B_path, B_new_path)
    os.remove(B_path)

    C_path = trainC_path + '/{}_C.jpg'.format(idx)
    C_new_path = testC_path + '/{}_C.jpg'.format(idx)
    shutil.copy(C_path, C_new_path)
    os.remove(C_path)
The code above randomly picks 494 images from trainA/B/C, copies them to testA/B/C, and removes the originals from the training folders.