Log Processing with log4j + Kafka + Storm + MongoDB + MySQL

Posted by 百聯達 on 2016-01-26
Part 1: Background
With systems now deployed in a distributed fashion, centralized log processing has become essential. This article describes a centralized log-processing pipeline built from log4j, Kafka, Storm, MongoDB, and MySQL. The solution is already running in a production environment.

log4j version: log4j-1.2.17.jar
Kafka version: kafka_2.10-0.8.2.1
ZooKeeper version: zookeeper-3.4.7.tar.gz
Storm version: apache-storm-0.9.5.tar.gz
MongoDB version: mongodb-linux-x86_64-2.6.11.tgz
MySQL version: mysql-5.5.31.tar.gz

Part 2: Software Installation
1. Kafka installation: see http://blog.itpub.net/28624388/viewspace-1966761/
2. Storm installation: see http://blog.itpub.net/28624388/viewspace-1973943/
3. MongoDB installation: see http://blog.itpub.net/28624388/viewspace-1974449/
4. MySQL installation: see http://blog.itpub.net/28624388/viewspace-764718/

Part 3: Integrating log4j with Kafka
See http://blog.itpub.net/28624388/viewspace-1972027/
Log format (pipe-delimited fields: a fixed prefix, user id, system, version, request URI, timestamp, and elapsed time; the Chinese literals mean "handle request" and "milliseconds"):

logger.warn("處理請求|"
        + ((null == userId || "".equals(userId.toString())) ? "0" : userId.toString())
        + "|" + system + "|" + version
        + "|" + request.getRequestURI()
        + "|" + GbdDateUtils.format(Calendar.getInstance().getTime(), "yyyy-MM-dd HH:mm:ss")
        + "|" + (System.currentTimeMillis() - accessTime) + "|毫秒!");
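A line emitted by the statement above can be split on `|` downstream to recover the individual fields, which is essentially what the Storm bolt later in this article does. A minimal sketch of that parsing step (the `LogLineParser`/`LogRecord` names and field names are illustrative assumptions, not part of the original code):

```java
// Splits one pipe-delimited line of the form produced by the logger.warn(...) call above.
// Class and field names here are hypothetical; the original bolt code is not shown in text.
public class LogLineParser {

    /** Simple value holder for one parsed log line (hypothetical, for illustration). */
    public static class LogRecord {
        public final String userId;
        public final String system;
        public final String version;
        public final String uri;
        public final String time;
        public final long elapsedMs;

        LogRecord(String userId, String system, String version, String uri, String time, long elapsedMs) {
            this.userId = userId;
            this.system = system;
            this.version = version;
            this.uri = uri;
            this.time = time;
            this.elapsedMs = elapsedMs;
        }
    }

    public static LogRecord parse(String line) {
        // Field order: prefix | userId | system | version | uri | timestamp | elapsed | "毫秒!"
        String[] f = line.split("\\|");
        return new LogRecord(f[1], f[2], f[3], f[4], f[5], Long.parseLong(f[6]));
    }

    public static void main(String[] args) {
        LogRecord r = parse("處理請求|1024|gmap|1.0|/api/poi/search|2016-01-26 10:00:00|35|毫秒!");
        System.out.println(r.uri + " took " + r.elapsedMs + " ms");
    }
}
```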

Part 4: Integrating Kafka, Storm, MongoDB, and MySQL
1. Code structure
(The module layout was shown as an image in the original post.)

2. The gmap-logs-deal-center module
This module consumes the log messages collected by Kafka, applies light processing in a Storm topology, and stores the results in MongoDB; a scheduled job then aggregates the log data held in MongoDB and writes the resulting statistics to MySQL.
a) The pom.xml configuration file

<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.gemdale.gmap</groupId>
        <artifactId>GMap</artifactId>
        <version>0.0.1-SNAPSHOT</version>
    </parent>
    <groupId>com.gemdale.gmap</groupId>
    <artifactId>gmap-logs-deal-center</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>gmap-logs-deal-center</name>
    <dependencies>
        <dependency>
            <groupId>${project.groupId}</groupId>
            <artifactId>gmap-common</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>${project.groupId}</groupId>
            <artifactId>gmap-common-logs-bo</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>${project.groupId}</groupId>
            <artifactId>gmap-common-redis-support</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.10</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>18.0</version>
        </dependency>
        <dependency>
            <groupId>com.101tec</groupId>
            <artifactId>zkclient</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-core</artifactId>
        </dependency>
        <dependency>
            <groupId>javax.persistence</groupId>
            <artifactId>persistence-api</artifactId>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-context</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-beans</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-mongodb</artifactId>
        </dependency>
        <dependency>
            <groupId>${project.groupId}</groupId>
            <artifactId>gmap-common-dao-support</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>org.quartz-scheduler</groupId>
            <artifactId>quartz</artifactId>
        </dependency>
    </dependencies>
</project>
b) The Start class

package com.enjoylink.gmap.logs.deal.center;

import java.util.ArrayList;
import java.util.List;

import org.apache.log4j.Logger;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.redis.core.RedisTemplate;

import com.enjoyslink.gmap.logs.deal.center.stormkafka.StormKafkaBolt;
import com.gemdale.gmap.common.redis.support.RedisUtil;
import com.gemdale.gmap.common.util.ConfigureUtil;

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.spout.SchemeAsMultiScheme;
import backtype.storm.topology.TopologyBuilder;
import storm.kafka.BrokerHosts;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;
import storm.kafka.StringScheme;
import storm.kafka.ZkHosts;

/**
 * @ClassName: Start
 * @Description: TODO (describe the purpose of this class in one sentence)
 * @author Information Management Dept - gengchong
 * @date 2015-10-15 17:42:52
 */
public class Start {
    private static Logger logger = Logger.getLogger(Start.class);

    @SuppressWarnings("resource")
    public static void main(String[] args) {
        // Initialize Spring and perform the required data initialization
        ClassPathXmlApplicationContext context = new ClassPathXmlApplicationContext("applicationContext.xml");

        // Obtain the interfaces and start serving
        // Initialize the Redis service
        logger.info("Initializing the Redis service...");
        RedisUtil.initRedisTemplate((RedisTemplate) context.getBean("redisTemplate"));

        logger.info("Initializing storm-kafka...");
        // Kafka topic
        String topic = ConfigureUtil.getProperty("topic", "gmap");
        // Kafka ZooKeeper ensemble
        BrokerHosts hosts = new ZkHosts(ConfigureUtil.getProperty("kafka.zk"));
        // Storm spout configuration
        SpoutConfig spoutConfig = new SpoutConfig(hosts, topic, "/kafkastorm", "gmapKafka");
        spoutConfig.scheme = new SchemeAsMultiScheme(new StringScheme());
        // Storm ZooKeeper ensemble
        List<String> zkServers = new ArrayList<>();
        zkServers.add(ConfigureUtil.getProperty("storm.zk.host"));
        spoutConfig.zkServers = zkServers;
        spoutConfig.zkPort = Integer.valueOf(ConfigureUtil.getProperty("storm.zk.port"));

        spoutConfig.forceFromStart = false;
        spoutConfig.socketTimeoutMs = 60 * 1000;
        // Build the topology
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("spout", new KafkaSpout(spoutConfig),
                Integer.valueOf(ConfigureUtil.getProperty("storm.max.thread")));
        // Storm bolt configuration
        builder.setBolt("bolt", new StormKafkaBolt(), Integer.valueOf(ConfigureUtil.getProperty("storm.max.thread")))
                .shuffleGrouping("spout");
        // Submit the topology
        Config config = new Config();
        config.setDebug(false);
        new LocalCluster().submitTopology("topology", config, builder.createTopology());

        logger.info("Gmap log processing center started successfully!");
    }
}
c) The StormKafkaBolt class
(The code for this class was shown as an image in the original post; it parses each Kafka message and stores it in MongoDB.)
d) transaction-context.xml

<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:aop="http://www.springframework.org/schema/aop"
    xmlns:tx="http://www.springframework.org/schema/tx"
    xmlns:mongo="http://www.springframework.org/schema/data/mongo"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-4.0.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context-4.0.xsd
        http://www.springframework.org/schema/aop
        http://www.springframework.org/schema/aop/spring-aop-4.0.xsd
        http://www.springframework.org/schema/data/mongo
        http://www.springframework.org/schema/data/mongo/spring-mongo-1.8.xsd
        http://www.springframework.org/schema/tx
        http://www.springframework.org/schema/tx/spring-tx-4.0.xsd">

    <bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
        <property name="location">
            <description>datasource config</description>
            <value>classpath:context-datasource.properties</value>
        </property>
    </bean>

    <mongo:db-factory host="${mongodb.host}" port="${mongodb.port}" dbname="${mongodb.database}"
        username="${mongodb.username}" password="${mongodb.password}" />

    <bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
        <constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
    </bean>

    <mongo:mapping-converter id="converter" db-factory-ref="mongoDbFactory" />
    <bean id="gridFsTemplate" class="org.springframework.data.mongodb.gridfs.GridFsTemplate">
        <constructor-arg ref="mongoDbFactory" />
        <constructor-arg ref="converter" />
    </bean>

    <bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
        <property name="dataSource" ref="platformTomcat" />
    </bean>

    <bean id="jdbcReadTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
        <property name="dataSource" ref="platformReadTomcat" />
    </bean>

    <bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
        <property name="dataSource" ref="platformTomcat" />
    </bean>

    <tx:annotation-driven transaction-manager="transactionManager" proxy-target-class="true" />
</beans>
e) The GmapSystemLogsCollectSchedule class
(The code for this class was shown as an image in the original post; it aggregates the MongoDB log data into MySQL.)
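Since the schedule's code is not available in text, here is a minimal sketch of the aggregation step it is described as performing: grouping raw log entries by request URI and computing call counts and average elapsed times. The `UriStats` class and its field names are assumptions for illustration; the real class would read from MongoDB via `MongoTemplate` and write the rows to MySQL via `JdbcTemplate`:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative stand-in for the statistics computed by the schedule:
// group log entries by request URI, then compute count and average elapsed time per URI.
public class UriStats {
    public final String uri;
    public final long count;
    public final double avgElapsedMs;

    UriStats(String uri, long count, double avgElapsedMs) {
        this.uri = uri;
        this.count = count;
        this.avgElapsedMs = avgElapsedMs;
    }

    /** Each input entry is {uri (String), elapsedMs (Long)}. */
    public static List<UriStats> aggregate(List<Object[]> entries) {
        Map<String, long[]> acc = new HashMap<>(); // uri -> {count, totalElapsed}
        for (Object[] e : entries) {
            long[] a = acc.computeIfAbsent((String) e[0], k -> new long[2]);
            a[0]++;
            a[1] += (Long) e[1];
        }
        List<UriStats> out = new ArrayList<>();
        for (Map.Entry<String, long[]> en : acc.entrySet()) {
            out.add(new UriStats(en.getKey(), en.getValue()[0],
                    (double) en.getValue()[1] / en.getValue()[0]));
        }
        return out;
    }
}
```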


3. The gmap-common-logs-bo module
The GmapSystemLogBo class
(The code for this class was shown as an image in the original post.)
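Since the class body is not available in text, here is a hypothetical reconstruction based on the pipe-delimited log format shown in Part 3. The field names are assumptions, not the original code:

```java
import java.io.Serializable;

// Hypothetical reconstruction of the business object persisted to MongoDB.
// Field names mirror the pipe-delimited log format; the original class was shown only as an image.
public class GmapSystemLogBo implements Serializable {
    private static final long serialVersionUID = 1L;

    private String userId;     // "0" when the request carried no user id
    private String system;     // calling system identifier
    private String version;    // client version
    private String requestUri; // request URI
    private String accessTime; // "yyyy-MM-dd HH:mm:ss"
    private long elapsedMs;    // request handling time in milliseconds

    public String getUserId() { return userId; }
    public void setUserId(String userId) { this.userId = userId; }
    public String getSystem() { return system; }
    public void setSystem(String system) { this.system = system; }
    public String getVersion() { return version; }
    public void setVersion(String version) { this.version = version; }
    public String getRequestUri() { return requestUri; }
    public void setRequestUri(String requestUri) { this.requestUri = requestUri; }
    public String getAccessTime() { return accessTime; }
    public void setAccessTime(String accessTime) { this.accessTime = accessTime; }
    public long getElapsedMs() { return elapsedMs; }
    public void setElapsedMs(long elapsedMs) { this.elapsedMs = elapsedMs; }
}
```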








From the ITPUB blog: http://blog.itpub.net/28624388/viewspace-1982560/. Please credit the source when reposting.
