TensorFlow API: Tensor Transformations

This article covers tensor transformation operations in TensorFlow, including string-to-number conversion, dtype casting, shape and reshape utilities, and slicing and joining. These operations are essential when working on complex machine learning tasks.

1. Casting (Type Conversion)
tf.string_to_number(string_tensor, out_type=None, name=None)  # convert a string tensor to numbers
tf.to_double(x, name='ToDouble')  # cast to 64-bit float (float64)
tf.to_float(x, name='ToFloat')    # cast to 32-bit float (float32)
tf.to_int32(x, name='ToInt32')    # cast to 32-bit integer (int32)
tf.to_int64(x, name='ToInt64')    # cast to 64-bit integer (int64)
tf.cast(x, dtype, name=None)      # cast x (or x.values) to dtype
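
A minimal sketch of these casts, assuming an eager TF 2.x environment; there the tf.to_* helpers listed above have been folded into tf.cast, and tf.string_to_number is exposed as tf.strings.to_number:

import tensorflow as tf

s = tf.constant(["1.5", "42"])
nums = tf.strings.to_number(s, out_type=tf.float32)  # string -> float32: [1.5, 42.0]

x = tf.constant([1, 2, 3], dtype=tf.int32)
as_f32 = tf.cast(x, tf.float32)   # replaces tf.to_float
as_f64 = tf.cast(x, tf.float64)   # replaces tf.to_double
as_i64 = tf.cast(x, tf.int64)     # replaces tf.to_int64

print(nums.numpy(), as_f32.dtype, as_f64.dtype, as_i64.dtype)
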
2. Shape and Reshape
# 't' is [[[1, 1, 1], [2, 2, 2]], [[3, 3, 3], [4, 4, 4]]]
tf.shape(input, name=None)  # returns the shape of the tensor ==> [2, 2, 3]
tf.size(input, name=None)   # returns the number of elements  ==> 12
tf.rank(input, name=None)   # returns the rank of the tensor (not the matrix rank) ==> 3
tf.reshape(tensor, shape, name=None)  # reshape the tensor; a -1 entry lets that dimension be inferred
tf.expand_dims(input, axis=None, name=None, dim=None)  # insert a dimension of size 1 at position axis (dim is the deprecated alias)
# 't' is a tensor of shape [2]
# shape(expand_dims(t, 0)) ==> [1, 2]
# shape(expand_dims(t, 1)) ==> [2, 1]
# shape(expand_dims(t, -1)) ==> [2, 1]
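
A minimal sketch of the shape utilities, run eagerly on the same 't' as in the comments above (assuming TF 2.x, where results print directly without a Session):

import tensorflow as tf

t = tf.constant([[[1, 1, 1], [2, 2, 2]], [[3, 3, 3], [4, 4, 4]]])

print(tf.shape(t).numpy())  # [2 2 3]
print(tf.size(t).numpy())   # 12
print(tf.rank(t).numpy())   # 3  (number of dimensions, not matrix rank)

flat = tf.reshape(t, [3, -1])  # -1 lets reshape infer the remaining dimension -> shape [3, 4]
print(flat.shape)

v = tf.constant([7, 8])              # shape [2]
print(tf.expand_dims(v, 0).shape)    # (1, 2)
print(tf.expand_dims(v, 1).shape)    # (2, 1)
print(tf.expand_dims(v, -1).shape)   # (2, 1)
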
3. Slicing and Joining
tf.slice(input_, begin, size, name=None)  # extract a slice from a tensor
tf.split(value, num_or_size_splits, axis=0, num=None, name='split')  # split a tensor into sub-tensors along one axis
tf.concat(concat_dim, values, name='concat')  # concatenate tensors along one axis
tf.pack(values, axis=0, name='pack')  # pack a list of rank-R tensors into one rank-(R+1) tensor (renamed tf.stack)
tf.reverse(tensor, dims, name=None)  # reverse along selected dimensions; dims is a list of bools with length equal to rank(tensor)
tf.transpose(a, perm=None, name='transpose')  # permute the dimensions of a tensor according to perm
tf.gather(params, indices, validate_indices=None, name=None)  # gather the slices of params indicated by indices
tf.one_hot(indices, depth, on_value=None, off_value=None, axis=None, dtype=None, name=None)  # one-hot encoding, e.g. indices = [0, 2, -1, 1]
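
A minimal sketch of the slicing and joining ops, written against current TF 2.x signatures: tf.pack is now tf.stack, tf.concat takes (values, axis), and tf.reverse takes a list of axis indices rather than a boolean mask:

import tensorflow as tf

t = tf.constant([[1, 2, 3], [4, 5, 6]])

print(tf.slice(t, begin=[0, 1], size=[2, 2]).numpy())  # [[2 3] [5 6]]
a, b, c = tf.split(t, num_or_size_splits=3, axis=1)    # three tensors of shape [2, 1]
print(tf.concat([a, b, c], axis=1).numpy())            # joined back into the original [2, 3] tensor
print(tf.stack([t, t], axis=0).shape)                  # two rank-2 tensors -> one rank-3 tensor: (2, 2, 3)
print(tf.reverse(t, axis=[1]).numpy())                 # reverse along columns: [[3 2 1] [6 5 4]]
print(tf.transpose(t, perm=[1, 0]).shape)              # (3, 2)
print(tf.gather(t, indices=[1, 0]).numpy())            # rows of t in swapped order
print(tf.one_hot([0, 2, -1, 1], depth=3).numpy())      # index -1 yields an all-off row
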

