[Java Graduation Project] A Cake Baking Sharing Platform with SpringBoot + Vue | Computer Science Capstone Project | IDEA + Navicat + MySQL Setup | Source Code + Documentation + Walkthrough Included


一、About the Author

💖💖 Author: 计算机编程果茶熊 💙💙 About me: I spent years teaching in computer-science training programs as a programming instructor, and I still enjoy teaching. I am proficient in Java, WeChat Mini Programs, Python, Golang, Android, and several other IT areas. I take on customized project development, code walkthroughs, thesis-defense coaching, and documentation writing, and I also know a few techniques for reducing similarity-check scores. I like sharing solutions to problems I run into during development and talking about technology, so if you have questions about code, feel free to ask! 💛💛 A word of thanks: thank you all for your attention and support! 💜💜 Website practical projects | Android / Mini-Program practical projects | Big-data practical projects | Graduation-project topic selection 💕💕 To get the source code, contact 计算机编程果茶熊 at the end of this post

二、System Overview

- Development languages: Java + Python
- Database: MySQL
- Architecture: B/S (browser/server)
- Backend frameworks: SpringBoot (Spring + SpringMVC + MyBatis) + Django
- Frontend: Vue + HTML + CSS + JavaScript + jQuery

The cake baking sharing platform is a vertical community system built on the SpringBoot + Vue stack. It focuses on interaction among baking enthusiasts and provides complete modules for user management, content publishing, community interaction, and mall shopping. Users can register and log in, publish their own baking works and write-ups, and share techniques and creative recipes through illustrated posts. Works are organized by category, and content can be filtered and browsed by cake type, difficulty level, preparation time, and other dimensions.

The platform integrates a full set of interaction features, including likes, comments, favorites, and follows, which foster an active community atmosphere. It also provides a baking-supplies mall module, where users can directly purchase tools and raw ingredients, closing the loop from learning to practice. Administrators use the back-office system to manage user behavior, content quality, and product information in a unified way, keeping the platform's content healthy.

The whole system adopts a front-end/back-end separated architecture: the backend exposes RESTful APIs built with the SpringBoot framework, the frontend is a responsive UI built with Vue.js, user and content data are stored in MySQL, and Redis is used for cache optimization to improve performance and user experience.
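To illustrate this front-end/back-end split, here is a minimal sketch of how a SpringBoot REST controller might sit in front of the service layer shown in section 五 below. Only the `BakingShareService` methods, `Result`, and `BakingWorkDto` come from the code excerpt; the paths, the `X-User-Id` header, and the parameter names are my own assumptions for illustration, not the project's actual code.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;
import java.util.List;

// Hypothetical controller sketch; endpoint paths and the user-id header are assumptions.
@RestController
@RequestMapping("/api/baking-works")
public class BakingWorkController {

    @Autowired
    private BakingShareService bakingShareService;

    // POST /api/baking-works — publish a new work together with its images
    @PostMapping
    public Result publish(@RequestPart("work") BakingWorkDto workDto,
                          @RequestPart("images") List<MultipartFile> images,
                          @RequestHeader("X-User-Id") Long userId) {
        return bakingShareService.publishBakingWork(workDto, images, userId);
    }

    // GET /api/baking-works/statistics — per-user interaction statistics
    @GetMapping("/statistics")
    public Result statistics(@RequestParam Long userId,
                             @RequestParam(defaultValue = "7d") String timeRange) {
        return bakingShareService.getUserInteractionData(userId, timeRange);
    }
}
```

In a real deployment the user id would normally come from the login session or a JWT rather than a plain request header; the header is used here only to keep the sketch short.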

三、Video Walkthrough

Cake Baking Sharing Platform - link

四、Feature Screenshots

[Feature screenshots]

五、Code Excerpts


import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.multipart.MultipartFile;
import java.math.BigDecimal;
import java.time.Duration;
import java.time.LocalDateTime;
import java.util.*;

import static org.apache.spark.sql.functions.avg;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

@Service
public class BakingShareService {
    @Autowired
    private BakingWorkRepository bakingWorkRepository;
    @Autowired
    private UserRepository userRepository;
    @Autowired
    private ProductRepository productRepository;
    @Autowired
    private OrderRepository orderRepository;
    @Autowired
    private RedisTemplate<String, Object> redisTemplate;
    @Autowired
    private FileUploadService fileUploadService;
    // Local SparkSession used by the analytics queries below
    private SparkSession spark = SparkSession.builder().appName("BakingDataAnalysis").master("local[*]").getOrCreate();
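    // Publish a new baking work: validate the author's account, persist the work,
    // upload its images, and update the Redis counters and indexes used by the feed.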
    @Transactional
    public Result publishBakingWork(BakingWorkDto workDto, List<MultipartFile> images, Long userId) {
        User user = userRepository.findById(userId).orElseThrow(() -> new BusinessException("用户不存在"));
        if (user.getStatus() != 1) {
            return Result.error("账户状态异常,无法发布作品");
        }
        BakingWork work = new BakingWork();
        work.setTitle(workDto.getTitle());
        work.setDescription(workDto.getDescription());
        work.setIngredients(workDto.getIngredients());
        work.setSteps(workDto.getSteps());
        work.setCategoryId(workDto.getCategoryId());
        work.setDifficultyLevel(workDto.getDifficultyLevel());
        work.setMakeTime(workDto.getMakeTime());
        work.setUserId(userId);
        work.setCreateTime(LocalDateTime.now());
        work.setStatus(0);
        work.setLikeCount(0);
        work.setCommentCount(0);
        work.setViewCount(0);
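        // Upload each attached image and store the returned URLs as a comma-separated string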
        List<String> imageUrls = new ArrayList<>();
        for (MultipartFile image : images) {
            String imageUrl = fileUploadService.uploadImage(image, "baking-works");
            imageUrls.add(imageUrl);
        }
        work.setImages(String.join(",", imageUrls));
        BakingWork savedWork = bakingWorkRepository.save(work);
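        // Track today's publish count for the user and index the new work in a
        // per-category sorted set scored by publish timestamp (newest first)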
        String userKey = "user:publish:count:" + userId;
        redisTemplate.opsForValue().increment(userKey, 1);
        redisTemplate.expire(userKey, Duration.ofDays(1));
        String categoryKey = "category:works:" + workDto.getCategoryId();
        redisTemplate.opsForZSet().add(categoryKey, savedWork.getId(), System.currentTimeMillis());
        return Result.success(savedWork);
    }
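    // Aggregate a user's interaction statistics (total likes/comments/views plus
    // daily publish counts for the selected time range) and run a Spark-based
    // per-category analysis. getStartTimeByRange(...) is a helper not shown in this excerpt.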
    public Result getUserInteractionData(Long userId, String timeRange) {
        List<BakingWork> userWorks = bakingWorkRepository.findByUserId(userId);
        Map<String, Object> interactionData = new HashMap<>();
        int totalLikes = userWorks.stream().mapToInt(BakingWork::getLikeCount).sum();
        int totalComments = userWorks.stream().mapToInt(BakingWork::getCommentCount).sum();
        int totalViews = userWorks.stream().mapToInt(BakingWork::getViewCount).sum();
        interactionData.put("totalLikes", totalLikes);
        interactionData.put("totalComments", totalComments);
        interactionData.put("totalViews", totalViews);
        interactionData.put("worksCount", userWorks.size());
        LocalDateTime startTime = getStartTimeByRange(timeRange);
        List<BakingWork> recentWorks = bakingWorkRepository.findByUserIdAndCreateTimeAfter(userId, startTime);
        Map<String, Integer> dailyData = new HashMap<>();
        for (BakingWork work : recentWorks) {
            String dateKey = work.getCreateTime().toLocalDate().toString();
            dailyData.put(dateKey, dailyData.getOrDefault(dateKey, 0) + 1);
        }
        interactionData.put("dailyPublishData", dailyData);
        Dataset<Row> workDataset = spark.createDataFrame(userWorks, BakingWork.class);
        Dataset<Row> categoryStats = workDataset.groupBy("categoryId").agg(
            functions.count("id").alias("count"),
            functions.avg("likeCount").alias("avgLikes"),
            functions.max("viewCount").alias("maxViews")
        );
        List<Row> categoryResults = categoryStats.collectAsList();
        Map<Long, Map<String, Object>> categoryAnalysis = new HashMap<>();
        for (Row row : categoryResults) {
            // Read the aggregates by column name and unwrap through Number, since Spark
            // may return int, long, or double depending on the entity field types
            Long categoryId = ((Number) row.getAs("categoryId")).longValue();
            Map<String, Object> stats = new HashMap<>();
            stats.put("count", ((Number) row.getAs("count")).longValue());
            stats.put("avgLikes", ((Number) row.getAs("avgLikes")).doubleValue());
            stats.put("maxViews", ((Number) row.getAs("maxViews")).longValue());
            categoryAnalysis.put(categoryId, stats);
        }
        interactionData.put("categoryAnalysis", categoryAnalysis);
        return Result.success(interactionData);
    }
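    // Place an order for a shop product: guard the stock with a short-lived Redis lock,
    // create the order, deduct stock and balance, then refresh cached sales statistics.
    // generateOrderNumber() is a helper not shown in this excerpt.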
    @Transactional
    public Result processProductOrder(Long userId, Long productId, Integer quantity, String deliveryAddress) {
        User user = userRepository.findById(userId).orElseThrow(() -> new BusinessException("用户不存在"));
        Product product = productRepository.findById(productId).orElseThrow(() -> new BusinessException("商品不存在"));
        if (product.getStock() < quantity) {
            return Result.error("商品库存不足");
        }
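        // Acquire a short-lived Redis lock (SET NX with a 10-second TTL) so concurrent
        // orders for the same product cannot oversell; stock is re-checked under the lock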
        String lockKey = "product:stock:lock:" + productId;
        Boolean lockAcquired = redisTemplate.opsForValue().setIfAbsent(lockKey, "1", Duration.ofSeconds(10));
        if (!Boolean.TRUE.equals(lockAcquired)) {
            return Result.error("系统繁忙,请稍后重试");
        }
        try {
            product = productRepository.findById(productId).orElseThrow(() -> new BusinessException("商品不存在"));
            if (product.getStock() < quantity) {
                return Result.error("商品库存不足");
            }
            BigDecimal totalAmount = product.getPrice().multiply(new BigDecimal(quantity));
            if (user.getBalance().compareTo(totalAmount) < 0) {
                return Result.error("账户余额不足");
            }
            Order order = new Order();
            order.setUserId(userId);
            order.setProductId(productId);
            order.setQuantity(quantity);
            order.setUnitPrice(product.getPrice());
            order.setTotalAmount(totalAmount);
            order.setDeliveryAddress(deliveryAddress);
            order.setStatus("PENDING");
            order.setCreateTime(LocalDateTime.now());
            order.setOrderNumber(generateOrderNumber());
            Order savedOrder = orderRepository.save(order);
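            // Deduct stock, bump the sales counter, and debit the buyer's balance
            // within the same transaction as the order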
            product.setStock(product.getStock() - quantity);
            product.setSalesCount(product.getSalesCount() + quantity);
            productRepository.save(product);
            user.setBalance(user.getBalance().subtract(totalAmount));
            userRepository.save(user);
            String userOrderKey = "user:orders:" + userId;
            redisTemplate.opsForList().leftPush(userOrderKey, savedOrder.getId());
            redisTemplate.expire(userOrderKey, Duration.ofDays(30));
            String productSalesKey = "product:sales:count:" + productId;
            redisTemplate.opsForValue().increment(productSalesKey, quantity);
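            // Load the last 30 days of orders through Spark's JDBC reader and cache
            // per-product sales statistics in Redis for 6 hours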
            Dataset<Row> orderDataset = spark.read().format("jdbc")
                .option("url", "jdbc:mysql://localhost:3306/baking_db")
                .option("dbtable", "orders")
                .option("user", "root")
                .option("password", "password")
                .load();
            Dataset<Row> salesAnalysis = orderDataset
                .filter(col("createTime").gt(java.sql.Timestamp.valueOf(LocalDateTime.now().minusDays(30))))
                .groupBy("productId")
                .agg(sum("quantity").alias("totalSales"), avg("totalAmount").alias("avgAmount"));
            List<Row> salesResults = salesAnalysis.collectAsList();
            for (Row row : salesResults) {
                Long pid = ((Number) row.get(0)).longValue();
                Long totalSales = ((Number) row.get(1)).longValue();
                // avg(...) over a DECIMAL column comes back as BigDecimal, so unwrap via Number
                Double avgAmount = ((Number) row.get(2)).doubleValue();
                String salesStatsKey = "product:sales:stats:" + pid;
                Map<String, Object> stats = new HashMap<>();
                stats.put("totalSales", totalSales);
                stats.put("avgAmount", avgAmount);
                redisTemplate.opsForHash().putAll(salesStatsKey, stats);
                redisTemplate.expire(salesStatsKey, Duration.ofHours(6));
            }
            return Result.success(savedOrder);
        } finally {
            redisTemplate.delete(lockKey);
        }
    }
}
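The service above also relies on several classes that are not part of this excerpt (Result, BusinessException, the DTOs, the entities, and the repositories). As a rough idea of what two of those dependencies could look like, here are sketches based on the calls the service makes; they are my assumptions, not the project's actual source. The first assumes Spring Data JPA and mirrors the derived query methods used in `getUserInteractionData`:

```java
import org.springframework.data.jpa.repository.JpaRepository;
import java.time.LocalDateTime;
import java.util.List;

// Sketch of the repository assumed by BakingShareService (Spring Data JPA assumed)
public interface BakingWorkRepository extends JpaRepository<BakingWork, Long> {
    // Derived queries invoked by the service
    List<BakingWork> findByUserId(Long userId);
    List<BakingWork> findByUserIdAndCreateTimeAfter(Long userId, LocalDateTime createTime);
}
```

The second sketch declares the `RedisTemplate<String, Object>` bean the service injects; Spring Boot only auto-configures `RedisTemplate<Object, Object>` and `StringRedisTemplate`, so a bean like this is usually defined explicitly. The serializer choices here are a common setup, not necessarily the one used in the project:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer;
import org.springframework.data.redis.serializer.StringRedisSerializer;

// Hypothetical Redis configuration providing the RedisTemplate<String, Object> bean
@Configuration
public class RedisConfig {
    @Bean
    public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory factory) {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(factory);
        template.setKeySerializer(new StringRedisSerializer());
        template.setHashKeySerializer(new StringRedisSerializer());
        template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
        template.setHashValueSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }
}
```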


六、Sample Documentation

[Documentation screenshot]

七、END

💕💕 To get the source code, contact 计算机编程果茶熊