
Flink 2.1.1 WordCount Example

Installation

Flink 2.1.1 Docker installation
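For quick reference, a minimal Docker Compose file for a Flink session cluster might look like the following. This is a sketch, not the authoritative setup from the installation article above; the image tag `flink:2.1.1`, the service names, and the slot count are assumptions.

```yaml
services:
  jobmanager:
    image: flink:2.1.1
    ports:
      - "8081:8081"          # Flink web UI
    command: jobmanager
    environment:
      - |
        FLINK_PROPERTIES=
        jobmanager.rpc.address: jobmanager

  taskmanager:
    image: flink:2.1.1
    depends_on:
      - jobmanager
    command: taskmanager
    environment:
      - |
        FLINK_PROPERTIES=
        jobmanager.rpc.address: jobmanager
        taskmanager.numberOfTaskSlots: 2
```

Start the cluster with `docker compose up -d` and the web UI should be reachable at http://localhost:8081.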

Java Code Example

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example.flink</groupId>
    <artifactId>flink-wordcount</artifactId>
    <version>1.0.0</version>
    <name>flink-wordcount</name>
    <packaging>jar</packaging>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <flink.version>2.1.1</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
        <log4j.version>2.24.3</log4j.version>
        <commons-math3.version>3.6.1</commons-math3.version>
        <lombok.version>1.18.26</lombok.version>
    </properties>

    <dependencies>
        <!-- Flink streaming core dependency -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- Flink client dependency, needed for local execution -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-math3</artifactId>
            <version>${commons-math3.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <!-- Java compiler plugin -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                </configuration>
            </plugin>
            <!-- Shade plugin: builds a fat jar with the main class in the manifest -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.4.1</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.example.flink.WordCount</mainClass>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Code

package com.example.flink;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

/**
 * Flink WordCount example.
 */
public class WordCount {

    public static void main(String[] args) throws Exception {
        // 1. Create the execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // 2. Create a static data source
        DataStream<String> text = env.fromElements(
                "To be, or not to be, that is the question:",
                "Whether 'tis nobler in the mind to suffer",
                "The slings and arrows of outrageous fortune,",
                "Or to take arms against a sea of troubles",
                "And by opposing end them.");

        // 3. Apply the transformations
        DataStream<Tuple2<String, Integer>> counts =
                // Split each line into words
                text.flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
                    @Override
                    public void flatMap(String value, Collector<Tuple2<String, Integer>> out) {
                        String[] words = value.toLowerCase().split("\\s+");
                        for (String word : words) {
                            // Strip punctuation, then skip tokens that end up empty
                            String cleaned = word.replaceAll("[^a-z]", "");
                            if (!cleaned.isEmpty()) {
                                out.collect(new Tuple2<>(cleaned, 1));
                            }
                        }
                    }
                })
                // Group by word
                .keyBy(value -> value.f0)
                // Sum the count for each word
                .sum(1);

        // 4. Print the results to the console
        counts.print();

        // 5. Execute the job
        env.execute("WordCount");
    }
}
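The tokenize-normalize-count logic inside the `flatMap` can be checked in plain Java before wiring it into a Flink job. A minimal sketch (the class name `WordCountLogicDemo` is made up for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class WordCountLogicDemo {

    // Same logic as the flatMap above: lowercase, split on whitespace,
    // strip non-letters, and count occurrences per word.
    static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                String cleaned = word.replaceAll("[^a-z]", "");
                if (!cleaned.isEmpty()) {
                    counts.merge(cleaned, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = count(new String[] {
                "To be, or not to be, that is the question:",
                "Whether 'tis nobler in the mind to suffer"
        });
        // "to" occurs three times across the two lines, "be" twice
        System.out.println(counts.get("to"));
        System.out.println(counts.get("be"));
    }
}
```

Note that in the Flink version the same per-word pairs are emitted one at a time and the summing is done by `keyBy(...).sum(1)` across the cluster, rather than in a local map.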

Running the Example

Upload the WordCount jar to Flink

[Screenshot: upload_wordcount_jar]

The WordCount job

[Screenshot: wordcount_job]

The WordCount job log

[Screenshot: wordcount_task_log]
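As an alternative to uploading through the web UI, the jar can be submitted from the command line. A sketch, assuming the jar was built with Maven and the JobManager runs in a Docker container named `jobmanager` (the container name is an assumption):

```shell
# Build the fat jar
mvn clean package

# Copy the jar into the JobManager container and submit it with the Flink CLI
docker cp target/flink-wordcount-1.0.0.jar jobmanager:/tmp/
docker exec jobmanager flink run -c com.example.flink.WordCount /tmp/flink-wordcount-1.0.0.jar
```

The `-c` flag selects the entry class; it could be omitted here because the shade plugin already writes `Main-Class` into the jar manifest.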

