11. High availability for the site home page: nginx + lua + redis
For pages with very large data volumes, users demanding fast responses, and heavy back-end load, what is our solution?
Normally the stack is html + javascript + ajax ---- controller + service + dao ---- database.
Slow: every request goes through Java code and a relational database.
Low concurrency: the Tomcat application server limits throughput.
Heavy load: it all lands on the database and the JVM.
So we adopt the chain browser - nginx (receives and handles the request) - lua - redis to solve this.
nginx:
web server
virtual hosts
reverse proxy and load balancing
lua: an embedded scripting language, and a very fast one.
redis: a non-relational (NoSQL) database, faster than MySQL.
I. Ad caching with nginx + lua + redis
1. OpenResty
OpenResty (also known as ngx_openresty) is a scalable web platform based on NGINX; think of it as nginx packaged together with an integrated Lua scripting engine. Install it before continuing.
2. Cache warming and two-level cache reads
2.1 Querying the data out of MySQL and storing it in Redis ahead of time is called cache warming.
2.2 Write a Lua script that implements the two-level cache read.
3. Implementation
3.1.1 Cache warming
(1) Connect to MySQL, read the ad list by ad category ID (the position parameter), and convert it to a JSON string.
(2) Connect to Redis and store the ad-list JSON string in Redis.
Create ad_update.lua in the /root/lua directory (the nginx config below loads this file); it connects to MySQL, runs the query, and stores the result in Redis.
ngx.header.content_type = "application/json;charset=utf8"
local cjson = require("cjson")
local mysql = require("resty.mysql")

-- read the ad position from the query string, e.g. ?position=web_index_lb
local uri_args = ngx.req.get_uri_args()
local position = uri_args["position"]

-- connect to MySQL
local db = mysql:new()
db:set_timeout(1000)
local props = {
    host = "192.168.200.128",
    port = 3306,
    database = "changgou_business",
    user = "root",
    password = "root"
}
local ok, err = db:connect(props)
if not ok then
    ngx.say("{flag:false}")
    return
end

-- query the currently active ads for this position
-- (ngx.quote_sql_str escapes the value, avoiding SQL injection)
local select_sql = "select url,image from tb_ad where status='1' and position="
        .. ngx.quote_sql_str(position)
        .. " and start_time<=NOW() and end_time>=NOW()"
local res = db:query(select_sql)
db:close()

-- connect to Redis and store the ad list as a JSON string
local redis = require("resty.redis")
local red = redis:new()
red:set_timeout(2000)
red:connect("192.168.200.128", 6379)
red:set("ad_" .. position, cjson.encode(res))
red:close()
ngx.say("{flag:true}")
Edit the /usr/local/openresty/nginx/conf/nginx.conf file as follows:
#user  nobody;
user root root;
worker_processes  1;

#error_log  logs/error.log;
#error_log  logs/error.log  notice;
#error_log  logs/error.log  info;
#pid        logs/nginx.pid;

events {
    worker_connections  1024;
}

http {
    include       mime.types;
    default_type  application/octet-stream;
    sendfile        on;
    #tcp_nopush     on;
    #keepalive_timeout  0;
    keepalive_timeout  65;
    #gzip  on;

    server {
        listen       80;
        server_name  localhost;
        charset utf-8;
        #access_log  logs/host.access.log  main;

        # added: cache-warming endpoint handled by the Lua script
        location /ad_update {
            content_by_lua_file /root/lua/ad_update.lua;
        }

        # redirect server error pages to the static page /50x.html
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }
    }
}
Restart nginx.
3.1.2 Reading the ad cache
The Lua script simply fetches the data straight from Redis.
Create ad_read.lua in the /root/lua directory:
ngx.header.content_type = "application/json;charset=utf8"
local uri_args = ngx.req.get_uri_args()
local position = uri_args["position"]

-- read the cached ad list straight from Redis
local redis = require("resty.redis")
local red = redis:new()
red:set_timeout(2000)
local ok, err = red:connect("192.168.200.128", 6379)
if not ok then
    ngx.say("{flag:false}")
    return
end
local rescontent = red:get("ad_" .. position)
red:close()
ngx.say(rescontent)
Add the following location to the server block of /usr/local/openresty/nginx/conf/nginx.conf:
location /ad_read {
    content_by_lua_file /root/lua/ad_read.lua;
}
Test: http://192.168.200.128/ad_read?position=web_index_lb outputs
[{"url":"img\/banner1.jpg","image":"img\/banner1.jpg"},{"url":"img\/banner2.jpg","image":"img\/banner2.jpg"}]
3.2.1 Two-level cache: adding the OpenResty local cache
1. Check the OpenResty local cache first; only on a miss query the data in Redis.
- Edit the ad_read file in the /root/lua directory as follows:
ngx.header.content_type = "application/json;charset=utf8"
local uri_args = ngx.req.get_uri_args()
local position = uri_args["position"]

-- level 1: the nginx shared-dict local cache (declared as dis_cache in nginx.conf)
local cache_ngx = ngx.shared.dis_cache
local adCache = cache_ngx:get("ad_cache_" .. position)
if adCache == "" or adCache == nil then
    -- level 2: fall back to Redis on a local-cache miss
    local redis = require("resty.redis")
    local red = redis:new()
    red:set_timeout(2000)
    local ok, err = red:connect("192.168.200.128", 6379)
    local rescontent = red:get("ad_" .. position)
    red:close()
    -- repopulate the local cache for 10 minutes
    cache_ngx:set("ad_cache_" .. position, rescontent, 10 * 60)
    ngx.say(rescontent)
else
    ngx.say(adCache)
end
Edit the nginx configuration (vi /usr/local/openresty/nginx/conf/nginx.conf) and add the following under the http node:
lua_shared_dict dis_cache 5m;  # open a 5 MB shared-memory zone used as the local cache
12. Syncing the cache and the search index with Canal
Caching: a cache can greatly reduce the load on the database.
1. A cache is only an optimization; it must never break normal business logic.
2. Before querying the database, check the cache first (if the data is there, return it directly).
3. After querying the database, add the result to the cache (ready for the next query).
4. Once there is a cache, cache synchronization has to be considered.
Cache synchronization strategies:
1. Immediate sync: when the database changes, evict or update the affected cache entries. Commonly used when correctness requirements are high.
2. Scheduled sync: the cache does not change the moment the database does; a timer periodically checks for changes and then syncs the cache.
3. Expiry-based sync: give cache entries a TTL; when it expires, the entry is removed automatically.
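The read path in points 2-3 above, combined with the expiry strategy, can be sketched as a minimal cache-aside helper. This is only a conceptual sketch: a hypothetical in-memory map stands in for Redis, and the loader callback stands in for the MySQL query.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal cache-aside sketch: check the cache first, fall back to the
// "database" (loader) on a miss, and store the result with an expiry time.
public class CacheAside {
    private static class Entry {
        final String value;
        final long expiresAt;   // millis since epoch
        Entry(String value, long expiresAt) { this.value = value; this.expiresAt = expiresAt; }
    }

    private final Map<String, Entry> cache = new HashMap<>();
    private final long ttlMillis;

    public CacheAside(long ttlMillis) { this.ttlMillis = ttlMillis; }

    // loader stands in for the real database query
    public String get(String key, Function<String, String> loader) {
        Entry e = cache.get(key);
        if (e != null && e.expiresAt > System.currentTimeMillis()) {
            return e.value;                   // cache hit
        }
        String value = loader.apply(key);     // cache miss: query the "database"
        cache.put(key, new Entry(value, System.currentTimeMillis() + ttlMillis));
        return value;                         // populated for the next query
    }

    // immediate sync: evict the entry when the underlying row changes
    public void evict(String key) { cache.remove(key); }

    public static void main(String[] args) {
        CacheAside cache = new CacheAside(10 * 60 * 1000);  // 10-minute TTL
        String v1 = cache.get("ad_web_index_lb", k -> "from-db");  // miss: loads
        String v2 = cache.get("ad_web_index_lb", k -> "changed");  // hit: cached value
        System.out.println(v1 + " " + v2);   // from-db from-db
        cache.evict("ad_web_index_lb");      // immediate sync after an update
        System.out.println(cache.get("ad_web_index_lb", k -> "changed"));  // changed
    }
}
```

The evict call is the "immediate sync" strategy; the TTL on each entry is the "expiry-based sync" fallback when no eviction arrives.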
When should the cache be synced? Timing matters!
Whenever data is inserted, updated, or deleted, it must be synced to Redis.
The original solution: find every insert/update/delete method and modify it. The workload is enormous.
The current solution: every insert/update/delete ultimately changes the database, so monitoring the database tells us exactly which data changed.
Database monitoring technology: Canal
Canal is a database-monitoring technology: whenever data in the database is inserted, updated, or deleted, your Canal handler code is invoked.
How it works: Canal impersonates a MySQL replica; when data changes, MySQL ships the change to the "replica" through its binlog, and Canal receives and processes that log.
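What Canal ultimately hands to application code is the before/after image of each changed row; diffing the two tells you which columns changed and therefore which caches or indexes need refreshing. A simplified, self-contained sketch of that idea (plain Maps stand in for Canal's CanalEntry.RowData column lists):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of the core of a row-change handler: compare the before/after
// column images of one row to find out which columns actually changed.
public class RowDiff {
    public static Set<String> changedColumns(Map<String, String> before, Map<String, String> after) {
        Set<String> changed = new HashSet<>();
        for (Map.Entry<String, String> e : after.entrySet()) {
            if (!e.getValue().equals(before.get(e.getKey()))) {
                changed.add(e.getKey());
            }
        }
        return changed;
    }

    public static void main(String[] args) {
        Map<String, String> before = new HashMap<>();
        before.put("position", "web_index_lb");
        before.put("status", "0");
        Map<String, String> after = new HashMap<>(before);
        after.put("status", "1");
        // only "status" changed, so only that transition needs handling
        System.out.println(changedColumns(before, after)); // [status]
    }
}
```

The SpuListener later in this section applies exactly this pattern, watching for the is_marketable column flipping between 0 and 1.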
Syncing the cache
When Canal detects an insert/update/delete, it publishes a message to RabbitMQ; the cache-sync consumer then calls the nginx service.
Flow:
1. A back-office user updates MySQL through the ad service.
2. The canal microservice watches the ad table in the database; on a change it sends the changed row's "position" value to RabbitMQ.
3. When the message arrives from RabbitMQ, a consumer uses OkHttpClient to make a remote call to Nginx, re-warming the cache and thereby keeping it in sync.
The canal microservice
application.properties:
canal.client.instances.example.host=192.168.200.128
canal.client.instances.example.port=11111
canal.client.instances.example.batchSize=1000
spring.rabbitmq.host=192.168.200.128
Bootstrap class CanalApplication:
package com.itheima.canal;
import com.xpand.starter.canal.annotation.EnableCanalClient;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@EnableCanalClient
public class CanalApplication {
    public static void main(String[] args) {
        SpringApplication.run(CanalApplication.class, args);
    }
}
RabbitMQ configuration class:
package com.itheima.canal.config;
import org.springframework.amqp.core.*;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RabbitMQConfig {

    // exchange for goods being listed (put on sale)
    public static final String GOODS_UP_EXCHANGE = "goods_up_exchange";
    // exchange for goods being delisted (taken off sale)
    public static final String GOODS_DOWN_EXCHANGE = "goods_down_exchange";
    // queue for ad-cache updates
    public static final String AD_UPDATE_QUEUE = "ad_update_queue";
    // queue for adding documents to the search index
    public static final String SEARCH_ADD_QUEUE = "search_add_queue";
    // queue for removing documents from the search index
    public static final String SEARCH_DEL_QUEUE = "search_del_queue";

    // declare the ad-update queue
    @Bean
    public Queue queue() {
        return new Queue(AD_UPDATE_QUEUE);
    }

    // declare the delisting exchange
    @Bean(GOODS_DOWN_EXCHANGE)
    public Exchange GOODS_DOWN_EXCHANGE() {
        return ExchangeBuilder.fanoutExchange(GOODS_DOWN_EXCHANGE).durable(true).build();
    }

    // declare the listing exchange
    @Bean(GOODS_UP_EXCHANGE)
    public Exchange exchange() {
        return ExchangeBuilder.fanoutExchange(GOODS_UP_EXCHANGE).durable(true).build();
    }

    // declare the listing queue
    @Bean(SEARCH_ADD_QUEUE)
    public Queue SEARCH_ADD_QUEUE() {
        return new Queue(SEARCH_ADD_QUEUE);
    }

    // declare the delisting queue
    @Bean(SEARCH_DEL_QUEUE)
    public Queue SEARCH_DEL_QUEUE() {
        return new Queue(SEARCH_DEL_QUEUE);
    }

    // bind the listing queue to the listing exchange
    @Bean
    public Binding SEARCH_ADD_QUEUE_BINDING(@Qualifier(GOODS_UP_EXCHANGE) Exchange exchange,
                                            @Qualifier(SEARCH_ADD_QUEUE) Queue queue) {
        return BindingBuilder.bind(queue).to(exchange).with("").noargs();
    }

    // bind the delisting queue to the delisting exchange
    @Bean
    public Binding SEARCH_DEL_QUEUE_BINDING(@Qualifier(GOODS_DOWN_EXCHANGE) Exchange exchange,
                                            @Qualifier(SEARCH_DEL_QUEUE) Queue queue) {
        return BindingBuilder.bind(queue).to(exchange).with("").noargs();
    }
}
BusinessListener, the Canal event listener:
package com.itheima.canal.listener;
import com.alibaba.otter.canal.protocol.CanalEntry;
import com.itheima.canal.config.RabbitMQConfig;
import com.xpand.starter.canal.annotation.CanalEventListener;
import com.xpand.starter.canal.annotation.ListenPoint;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * @author ZJ
 */
@CanalEventListener
public class BusinessListener {

    @Autowired
    private RabbitTemplate rabbitTemplate;

    // fires for changes to the tb_ad table of changgou_business
    @ListenPoint(schema = "changgou_business", table = {"tb_ad"})
    public void adUpdate(CanalEntry.EventType eventType, CanalEntry.RowData rowData) {
        for (CanalEntry.Column column : rowData.getAfterColumnsList()) {
            if ("position".equals(column.getName())) {
                System.out.println("Sending the changed position to MQ: " + column.getValue());
                // publish to the default exchange, routed by queue name
                rabbitTemplate.convertAndSend("", RabbitMQConfig.AD_UPDATE_QUEUE, column.getValue());
            }
        }
    }
}
The ad microservice
AdListener:
package com.changgou.business.listener;
import okhttp3.*;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;
import java.io.IOException;

@Component
public class AdListener {

    @RabbitListener(queues = "ad_update_queue")
    public void receiveMessage(String message) {
        // make the remote call that re-warms the nginx cache
        OkHttpClient okHttpClient = new OkHttpClient();
        String url = "http://192.168.200.128/ad_update?position=" + message;
        Request request = new Request.Builder().url(url).build();
        Call call = okHttpClient.newCall(request);
        call.enqueue(new Callback() {
            @Override
            public void onFailure(Call call, IOException e) {
                // request failed
                e.printStackTrace();
            }

            @Override
            public void onResponse(Call call, Response response) throws IOException {
                // request succeeded
                System.out.println("Request succeeded: " + response.message());
            }
        });
    }
}
- When the cache-sync listener receives a message, it calls the Nginx service.
- Nginx runs the Lua code.
- The Lua code queries MySQL with the position condition and writes the result into Redis.
Implementing search with Elasticsearch
Analysis:
When a product is listed (put on sale), it must be added to the search index: when the database records a listing, query that row and add it to the index.
Flow:
1. A back-office user lists a product through the goods microservice, which updates MySQL.
2. In the canal microservice, SpuListener watches the is_marketable column of the spu table and sends the spuId to RabbitMQ.
3. RabbitMQ holds the message in a queue; a listener in the search microservice consumes that queue and, when a message arrives, indexes the data (using a Feign remote call to the goods service to fetch the sku list by spuId).
4. The search microservice also provides two maintenance operations: creating the index mapping, and bulk-importing all data into the ES index.
Delisting works the same way, in reverse.
The canal microservice
SpuListener:
package com.itheima.canal.listener;
import com.alibaba.otter.canal.protocol.CanalEntry;
import com.itheima.canal.config.RabbitMQConfig;
import com.xpand.starter.canal.annotation.CanalEventListener;
import com.xpand.starter.canal.annotation.ListenPoint;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.HashMap;
import java.util.Map;

/**
 * @author ZJ
 */
@CanalEventListener
public class SpuListener {

    @Autowired
    private RabbitTemplate rabbitTemplate;

    /**
     * Fires on updates to the spu table.
     * @param eventType the type of change
     * @param rowData   the row's before/after column values
     */
    @ListenPoint(schema = "changgou_goods", table = {"tb_spu"}, eventType = CanalEntry.EventType.UPDATE)
    public void spuUp(CanalEntry.EventType eventType, CanalEntry.RowData rowData) {
        System.err.println("tb_spu table data changed");
        // column values before the update
        Map<String, String> oldMap = new HashMap<>();
        rowData.getBeforeColumnsList().forEach((c) -> oldMap.put(c.getName(), c.getValue()));
        // column values after the update
        Map<String, String> newMap = new HashMap<>();
        rowData.getAfterColumnsList().forEach((c) -> newMap.put(c.getName(), c.getValue()));
        // is_marketable changing from 0 to 1 means the product was listed
        if ("0".equals(oldMap.get("is_marketable")) && "1".equals(newMap.get("is_marketable"))) {
            // send the spuId to the MQ listing exchange
            rabbitTemplate.convertAndSend(RabbitMQConfig.GOODS_UP_EXCHANGE, "", newMap.get("id"));
        }
        // is_marketable changing from 1 to 0 means the product was delisted
        if ("1".equals(oldMap.get("is_marketable")) && "0".equals(newMap.get("is_marketable"))) {
            // send the spuId to the MQ delisting exchange
            rabbitTemplate.convertAndSend(RabbitMQConfig.GOODS_DOWN_EXCHANGE, "", newMap.get("id"));
        }
    }
}
The search microservice
GoodsUpListener:
package com.changgou.search.listener;
import com.changgou.search.config.RabbitMQConfig;
import com.changgou.search.service.EsManagerService;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class GoodsUpListener {

    @Autowired
    private EsManagerService esManagerService;

    // consume the listing queue and index the spu's skus
    @RabbitListener(queues = RabbitMQConfig.SEARCH_ADD_QUEUE)
    public void receiveMessage(String spuId) {
        esManagerService.importDataToESBySpuId(spuId);
    }
}
EsManagerService
package com.changgou.search.service;

public interface EsManagerService {

    /**
     * Create the index and its mapping.
     */
    void createIndexAndMapping();

    /**
     * Import all data into the ES index.
     */
    void importAll();

    /**
     * Import the data for one spuId into the ES index.
     */
    void importDataToESBySpuId(String spuId);

    /**
     * Delete the index documents for one spuId.
     */
    void deleteDataToESBySpuId(String spuId);
}
EsManagerServiceImpl
package com.changgou.search.service.impl;
import com.alibaba.fastjson.JSON;
import com.changgou.goods.feign.SkuFeign;
import com.changgou.goods.pojo.Sku;
import com.changgou.search.dao.ESManagerMapper;
import com.changgou.search.pojo.SkuInfo;
import com.changgou.search.service.EsManagerService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.stereotype.Service;
import java.util.List;
import java.util.Map;

@Service
public class EsManagerServiceImpl implements EsManagerService {

    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    @Autowired
    private SkuFeign skuFeign;

    @Autowired
    private ESManagerMapper esManagerMapper;

    /**
     * Create the index and its mapping.
     */
    @Override
    public void createIndexAndMapping() {
        // create the index
        elasticsearchTemplate.createIndex(SkuInfo.class);
        // create the mapping
        elasticsearchTemplate.putMapping(SkuInfo.class);
    }

    @Override
    public void importAll() {
        // fetch the full sku list
        List<Sku> skuList = skuFeign.findSkuListBySpuId("all");
        if (skuList == null || skuList.size() <= 0) {
            throw new RuntimeException("No data found; nothing to import into the index");
        }
        // convert the sku list to JSON
        String jsonSkuList = JSON.toJSONString(skuList);
        // convert the JSON back into SkuInfo objects
        List<SkuInfo> skuInfos = JSON.parseArray(jsonSkuList, SkuInfo.class);
        for (SkuInfo skuInfo : skuInfos) {
            // parse the spec JSON string into the specMap field
            Map specMap = JSON.parseObject(skuInfo.getSpec(), Map.class);
            skuInfo.setSpecMap(specMap);
        }
        // write into the index
        esManagerMapper.saveAll(skuInfos);
    }

    @Override
    public void importDataToESBySpuId(String spuId) {
        List<Sku> skuList = skuFeign.findSkuListBySpuId(spuId);
        if (skuList == null || skuList.size() <= 0) {
            throw new RuntimeException("No data found; nothing to import into the index");
        }
        String jsonSkuList = JSON.toJSONString(skuList);
        List<SkuInfo> skuInfos = JSON.parseArray(jsonSkuList, SkuInfo.class);
        for (SkuInfo skuInfo : skuInfos) {
            Map map = JSON.parseObject(skuInfo.getSpec(), Map.class);
            skuInfo.setSpecMap(map);
        }
        esManagerMapper.saveAll(skuInfos);
    }

    @Override
    public void deleteDataToESBySpuId(String spuId) {
        List<Sku> skuList = skuFeign.findSkuListBySpuId(spuId);
        if (skuList == null || skuList.size() <= 0) {
            throw new RuntimeException("No data found; nothing to delete from the index");
        }
        for (Sku sku : skuList) {
            esManagerMapper.deleteById(Long.parseLong(sku.getId()));
        }
    }
}
ESManagerMapper
package com.changgou.search.dao;
import com.changgou.search.pojo.SkuInfo;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
public interface ESManagerMapper extends ElasticsearchRepository<SkuInfo,Long> {
}
ESManagerController
package com.changgou.search.controller;
import com.changgou.entity.Result;
import com.changgou.entity.StatusCode;
import com.changgou.search.service.EsManagerService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/manager")
public class ESManagerController {

    @Autowired
    private EsManagerService esManagerService;

    // create the index and mapping
    @GetMapping("/create")
    public Result create() {
        esManagerService.createIndexAndMapping();
        return new Result(true, StatusCode.OK, "Index and mapping created");
    }

    // import all data
    @GetMapping("/importAll")
    public Result importAll() {
        esManagerService.importAll();
        return new Result(true, StatusCode.OK, "All data imported");
    }
}
The goods microservice also needs its own copy of the earlier RabbitMQConfig so that its listener can use it.
In changgou_service_search_api, add an entity class that receives the List queried from the database and converts it into our own SkuInfo entity, which is what gets stored in the index.
SkuInfo:
package com.changgou.search.pojo;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import javax.persistence.Id;
import java.io.Serializable;
import java.util.Date;
import java.util.Map;

@Document(indexName = "skuinfo", type = "docs")
public class SkuInfo implements Serializable {

    // product id, which is also the product number
    @Id
    @Field(index = true, store = true, type = FieldType.Keyword)
    private Long id;

    // SKU name
    @Field(index = true, store = true, type = FieldType.Text, analyzer = "ik_smart")
    private String name;

    // product price, in yuan
    @Field(index = true, store = true, type = FieldType.Double)
    private Long price;

    // stock quantity
    @Field(index = true, store = true, type = FieldType.Integer)
    private Integer num;

    // product image
    @Field(index = false, store = true, type = FieldType.Text)
    private String image;

    // product status: 1 = normal, 2 = delisted, 3 = deleted
    @Field(index = true, store = true, type = FieldType.Keyword)
    private String status;

    // creation time
    private Date createTime;

    // update time
    private Date updateTime;

    // whether this is the default sku
    @Field(index = true, store = true, type = FieldType.Keyword)
    private String isDefault;

    // SPU id
    @Field(index = true, store = true, type = FieldType.Long)
    private Long spuId;

    // category id
    @Field(index = true, store = true, type = FieldType.Long)
    private Long categoryId;

    // category name
    @Field(index = true, store = true, type = FieldType.Keyword)
    private String categoryName;

    // brand name
    @Field(index = true, store = true, type = FieldType.Keyword)
    private String brandName;

    // spec string
    private String spec;

    // spec parameters
    private Map<String, Object> specMap;

    // getters/setters omitted
}
Add the Feign remote-call interface in service_goods_api:
package com.changgou.goods.feign;
import com.changgou.goods.pojo.Sku;
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import java.util.List;

@FeignClient(name = "goods")
public interface SkuFeign {

    /**
     * Query the sku list for a spuId.
     */
    @GetMapping("/sku/spu/{spuId}")
    List<Sku> findSkuListBySpuId(@PathVariable("spuId") String spuId);
}