serialization-Avro

This article looks at Avro as an efficient serialization and deserialization tool: how it works, how to pull it into a project, and what the code looks like. A concrete example shows how to generate a User class from an Avro schema and then serialize and deserialize User records.
Get a feel for one of the fastest serialization and deserialization tools around: Avro.
Further reading:
[url]http://tech.meituan.com/serialization_vs_deserialization.html[/url]
Official docs:
[url]http://avro.apache.org/docs/current/gettingstartedjava.html#Creating+users[/url]
Project structure:
[img]http://dl2.iteye.com/upload/attachment/0108/9625/bb06ca9d-3639-3b5b-9c11-d7b405282407.jpg[/img]

Add the following to pom.xml:

<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.7.7</version>
</dependency>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro-maven-plugin</artifactId>
            <version>1.7.7</version>
            <executions>
                <execution>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>schema</goal>
                    </goals>
                    <configuration>
                        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
                        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.6</source>
                <target>1.6</target>
            </configuration>
        </plugin>
    </plugins>
</build>


user.avsc

{"namespace": "com.gym.backadmin.controller",
"type": "record",
"name": "User",
"fields": [
{"name": "name", "type": "string"},
{"name": "favorite_number", "type": ["int", "null"]},
{"name": "favorite_color", "type": ["string", "null"]}
]
}


Note: the User class code can be generated directly from the .avsc file above. The avro-maven-plugin configured in the pom is bound to the generate-sources phase, so running mvn generate-sources (or a later phase such as mvn compile) writes the generated User class into src/main/java.
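
As a quick sanity check, the same schema file can also be parsed and inspected at runtime with Avro's Schema.Parser. A minimal sketch, assuming the schema lives at src/main/avro/user.avsc as configured in the plugin above (the class name PrintSchema is made up for illustration):

import java.io.File;

import org.apache.avro.Schema;

public class PrintSchema {
    public static void main(String[] args) throws Exception {
        // Path assumed from the avro-maven-plugin sourceDirectory configured above
        Schema schema = new Schema.Parser().parse(new File("src/main/avro/user.avsc"));
        System.out.println(schema.getFullName());  // e.g. com.gym.backadmin.controller.User
        System.out.println(schema.toString(true)); // pretty-printed schema JSON
    }
}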


import java.io.File;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.io.DatumReader;
import org.apache.avro.specific.SpecificDatumReader;

import com.gym.backadmin.controller.User;

public class Deserializing {
    public static void main(String[] args) throws Exception {
        // Read the container file written by the Serializing example
        File file = new File("users.avro");
        DatumReader<User> userDatumReader = new SpecificDatumReader<User>(User.class);
        DataFileReader<User> dataFileReader = new DataFileReader<User>(file, userDatumReader);
        User user = null;
        while (dataFileReader.hasNext()) {
            // Reuse the user object by passing it to next(). This saves us from
            // allocating and garbage collecting many objects for files with
            // many items.
            user = dataFileReader.next(user);
            System.out.println(user);
        }
        dataFileReader.close();
    }
}
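
Assuming the Serializing example below has already written users.avro, running Deserializing should print each record as JSON (the generated User class renders itself that way in toString()), roughly:

{"name": "Alyssa", "favorite_number": 256, "favorite_color": null}
{"name": "Ben", "favorite_number": 7, "favorite_color": "red"}
{"name": "Charlie", "favorite_number": null, "favorite_color": "blue"}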





import java.io.File;
import java.io.IOException;

import org.apache.avro.file.DataFileWriter;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.specific.SpecificDatumWriter;

import com.gym.backadmin.controller.User;

public class Serializing {
    public static void main(String[] args) throws IOException {
        User user1 = new User();
        user1.setName("Alyssa");
        user1.setFavoriteNumber(256);
        // Leave favorite color null

        // Alternate constructor
        User user2 = new User("Ben", 7, "red");

        // Construct via builder
        User user3 = User.newBuilder()
                .setName("Charlie")
                .setFavoriteColor("blue")
                .setFavoriteNumber(null)
                .build();

        // Write all three users to an Avro container file
        DatumWriter<User> userDatumWriter = new SpecificDatumWriter<User>(User.class);
        DataFileWriter<User> dataFileWriter = new DataFileWriter<User>(userDatumWriter);
        dataFileWriter.create(user1.getSchema(), new File("users.avro"));
        dataFileWriter.append(user1);
        dataFileWriter.append(user2);
        dataFileWriter.append(user3);
        dataFileWriter.close();
    }
}
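
Code generation is convenient but not required: the same schema can be read and written through Avro's generic API with GenericRecord. A rough sketch in the spirit of the examples above, assuming user.avsc sits at the path shown (the class name GenericUserExample and the file name users-generic.avro are made up for illustration):

import java.io.File;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;

public class GenericUserExample {
    public static void main(String[] args) throws Exception {
        // Parse the schema at runtime instead of using the generated User class
        Schema schema = new Schema.Parser().parse(new File("src/main/avro/user.avsc"));

        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "Dana");
        user.put("favorite_number", 42);
        // favorite_color is left unset, which the ["string", "null"] union allows

        // Write one record to a container file
        File file = new File("users-generic.avro");
        DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<GenericRecord>(schema);
        DataFileWriter<GenericRecord> fileWriter = new DataFileWriter<GenericRecord>(datumWriter);
        fileWriter.create(schema, file);
        fileWriter.append(user);
        fileWriter.close();

        // Read it back as GenericRecord
        DatumReader<GenericRecord> datumReader = new GenericDatumReader<GenericRecord>(schema);
        DataFileReader<GenericRecord> fileReader = new DataFileReader<GenericRecord>(file, datumReader);
        while (fileReader.hasNext()) {
            System.out.println(fileReader.next());
        }
        fileReader.close();
    }
}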