Members can ask questions here; Baizhan Programmer instructors answer every one.
Q&As that are helpful to everyone get marked as "Recommended".
After finishing a course, browsing other students' questions will help you learn more thoroughly.
So far, students have asked a total of 133,662 questions.

Teacher, my code here is the same as what you wrote — why doesn't mine save any data?

2022-04-24 09:15:26 [scrapy.utils.log] INFO: Scrapy 2.6.1 started (bot: scrapy03)
2022-04-24 09:15:26 [scrapy.utils.log] INFO: Versions: lxml 4.8.0.0, libxml2 2.9.12, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 22.4.0, Python 3.9.2 (tags/v3.9.2:1a79785, Feb 19 2021, 13:44:55) [MSC v.1928 64 bit (AMD64)], pyOpenSSL 22.0.0 (OpenSSL 1.1.1m  14 Dec 2021), cryptography 36.0.1, Platform Windows-10-10.0.22000-SP0
2022-04-24 09:15:26 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'scrapy03',
 'DOWNLOAD_DELAY': 3,
 'NEWSPIDER_MODULE': 'scrapy03.spiders',
 'SPIDER_MODULES': ['scrapy03.spiders'],
 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 '
               '(KHTML, like Gecko) Chrome/99.0.4844.35 Safari/537.36'}
2022-04-24 09:15:26 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2022-04-24 09:15:26 [scrapy.extensions.telnet] INFO: Telnet Password: 9a28e1c3964e206e
2022-04-24 09:15:26 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
2022-04-24 09:15:27 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2022-04-24 09:15:27 [scrapy.middleware] INFO: Enabled item pipelines:
['scrapy03.pipelines.XSPipeline']
2022-04-24 09:15:27 [scrapy.core.engine] INFO: Spider opened
2022-04-24 09:15:27 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2022-04-24 09:15:27 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2022-04-24 09:15:40 [filelock] DEBUG: Attempting to acquire lock 1927152576880 on C:\Users\zhangpei\AppData\Local\Programs\Python\Python39\lib\site-packages\tldextract\.suffix_cache/publicsuffix.org-tlds\de84b5ca2167d4c83e38fb162f2e8738.tldextract.json.lock
2022-04-24 09:15:40 [filelock] DEBUG: Lock 1927152576880 acquired on C:\Users\zhangpei\AppData\Local\Programs\Python\Python39\lib\site-packages\tldextract\.suffix_cache/publicsuffix.org-tlds\de84b5ca2167d4c83e38fb162f2e8738.tldextract.json.lock
2022-04-24 09:15:40 [filelock] DEBUG: Attempting to release lock 1927152576880 on C:\Users\zhangpei\AppData\Local\Programs\Python\Python39\lib\site-packages\tldextract\.suffix_cache/publicsuffix.org-tlds\de84b5ca2167d4c83e38fb162f2e8738.tldextract.json.lock
2022-04-24 09:15:40 [filelock] DEBUG: Lock 1927152576880 released on C:\Users\zhangpei\AppData\Local\Programs\Python\Python39\lib\site-packages\tldextract\.suffix_cache/publicsuffix.org-tlds\de84b5ca2167d4c83e38fb162f2e8738.tldextract.json.lock
2022-04-24 09:15:40 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.qbiqu.com/0_1/> (referer: None)
2022-04-24 09:15:40 [scrapy.spidermiddlewares.offsite] DEBUG: Filtered offsite request to 'www.qbiqu.com': <GET https://www.qbiqu.com/0_1/1.html>
2022-04-24 09:15:40 [scrapy.core.engine] INFO: Closing spider (finished)
2022-04-24 09:15:40 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 297,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 25899,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 13.305067,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2022, 4, 24, 1, 15, 40, 531737),
 'httpcompression/response_bytes': 173927,
 'httpcompression/response_count': 1,
 'log_count/DEBUG': 7,
 'log_count/INFO': 10,
 'offsite/domains': 1,
 'offsite/filtered': 1,
 'request_depth_max': 1,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2022, 4, 24, 1, 15, 27, 226670)}
2022-04-24 09:15:40 [scrapy.core.engine] INFO: Spider closed (finished)
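The log itself shows why nothing was saved: the follow-up request to `https://www.qbiqu.com/0_1/1.html` was dropped by OffsiteMiddleware ("Filtered offsite request to 'www.qbiqu.com'"), so no chapter page was ever parsed and the pipeline never received an item. That filtering usually means the spider's `allowed_domains` contains something other than a bare domain (a full URL, or a domain with a path). The sketch below mirrors roughly how the offsite check matches request hosts against `allowed_domains` — it is an illustration of the rule, not Scrapy's actual code:

```python
from urllib.parse import urlparse

def url_allowed(url, allowed_domains):
    """Roughly how Scrapy's OffsiteMiddleware decides whether a request is
    in scope: the request host must equal an allowed domain or be a
    subdomain of one."""
    host = urlparse(url).netloc.lower()
    for domain in allowed_domains:
        domain = domain.lower()
        if host == domain or host.endswith("." + domain):
            return True
    return False

# A bare domain keeps www.qbiqu.com in scope:
print(url_allowed("https://www.qbiqu.com/0_1/1.html", ["qbiqu.com"]))               # True
# A full URL (or a domain plus path) in allowed_domains matches no host at all:
print(url_allowed("https://www.qbiqu.com/0_1/1.html", ["https://www.qbiqu.com"]))   # False
```

So the first thing to check is the spider's `allowed_domains`: it should read `allowed_domains = ['qbiqu.com']` with no scheme and no path. Alternatively, yielding the request with `dont_filter=True` bypasses the offsite check entirely.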


Python Series / Stage 16: Python Crawler Development / Using the Scrapy Framework (old) — Post #11446

/**
 * Test class
 */
package com.itbaizhan;

public class Test {
    public static void main(String[] args) {
        StatementTest st = new StatementTest();
        st.insertUser("hxf", 25);
    }
}

package com.itbaizhan;

import java.sql.Connection;
import java.sql.Statement;

/**
 * Using the Statement object
 */
public class StatementTest {
    public void insertUser(String username, int userage) {
        Connection connection = null;
        Statement statement = null;
        try {
            // Obtain the Connection object
            connection = JdbcUtils.getConnection();
            // Obtain the Statement object
            statement = connection.createStatement();
            // Define the SQL statement to execute
            String sql = "insert into users values(default,'" + username + "'," + userage + ")";
            // Execute the SQL. Returns true if the statement produced a
            // result set, false if it did not.
            boolean execute = statement.execute(sql);
            System.out.println(execute);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            JdbcUtils.closeResource(statement, connection);
        }
    }
}
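A side note on the SQL built above: concatenating user input into the statement produces malformed (and injectable) SQL as soon as the input contains a quote character. The database-free sketch below just reproduces the concatenation from `StatementTest` to show the effect; in real code, `connection.prepareStatement("insert into users values(default,?,?)")` with placeholders avoids the problem.

```java
// Demonstrates why string-concatenated SQL (as in StatementTest above) is
// fragile: any quote in the input breaks the statement's quoting.
public class SqlConcatDemo {
    public static String concatenated(String username, int userage) {
        // Same concatenation pattern as the sql variable in StatementTest.
        return "insert into users values(default,'" + username + "'," + userage + ")";
    }

    public static void main(String[] args) {
        System.out.println(concatenated("hxf", 25));
        // A name containing a quote yields SQL with unbalanced quoting:
        System.out.println(concatenated("o'brien", 30));
    }
}
```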



package com.itbaizhan;

import java.io.InputStream;
import java.sql.*;
import java.util.Properties;

/**
 * JDBC utility class
 */
public class JdbcUtils {
    private static String url;
    private static String name;
    private static String pwd;

    static {
        try {
            // Instantiate a Properties object
            Properties prop = new Properties();
            // Get a byte input stream for reading the properties file
            InputStream is = JdbcTest2.class.getClassLoader().getResourceAsStream("jdbc.properties");
            // Read and parse the properties file
            prop.load(is);
            // Get the database URL, user name, and password
            String url = prop.getProperty("url");
            String name = prop.getProperty("username");
            String pwd = prop.getProperty("pwd");
            // Get the fully qualified driver class name
            String driver = prop.getProperty("driver");
            // Load and register the driver
            Class.forName(driver);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Get a database connection object
    public static Connection getConnection() {
        Connection connection = null;
        try {
            connection = DriverManager.getConnection(url, name, pwd);
        } catch (SQLException e) {
            e.printStackTrace();
        }
        return connection;
    }

    // Close the connection object
    public static void closeConnection(Connection connection) {
        try {
            connection.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // Commit the transaction
    public static void commit(Connection connection) {
        try {
            connection.commit();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // Roll back the transaction
    public static void rollback(Connection connection) {
        try {
            connection.rollback();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // Close the Statement object
    public static void closeStatement(Statement statement) {
        try {
            statement.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // Close the ResultSet
    public static void closeResultSet(ResultSet resultSet) {
        try {
            resultSet.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // Close resources after a DML operation
    public static void closeResource(Statement statement, Connection connection) {
        // Close the Statement first
        closeStatement(statement);
        // Then close the Connection
        closeConnection(connection);
    }

    // Close resources after a query
    public static void closeResource(ResultSet resultSet, Statement statement, Connection connection) {
        // Close the ResultSet first
        closeResultSet(resultSet);
        // Then close the Statement
        closeStatement(statement);
        // Finally close the Connection
        closeConnection(connection);
    }
}


# URL for connecting to the MySQL database
url=jdbc:mysql://localhost:3306/itbz?useSSL=false&useUnicode=true&characterEncoding=utf-8
# Database user name
username=root
# Database password
pwd=root
# Driver class name
driver=com.mysql.jdbc.Driver



Teacher, I've checked my code several times and tried the URL as well. In the previous lesson's example, connecting to the database without the utility class worked without errors, but after going over the utility-class code repeatedly I can't figure out where it goes wrong. Could you take a look?
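The likely culprit is in the `static` block of `JdbcUtils`: `String url = prop.getProperty("url");` declares a new block-local variable named `url` instead of assigning the static field, and the same applies to `name` and `pwd`. The fields therefore stay `null`, and `DriverManager.getConnection(null, null, null)` fails. Dropping the `String` keyword on those three lines assigns the fields instead (loading the resource via `JdbcUtils.class` rather than `JdbcTest2.class` would also be tidier). A minimal, database-free sketch of the shadowing effect:

```java
// Minimal sketch of the shadowing suspected in JdbcUtils (no database needed).
public class ShadowDemo {
    private static String url;       // never assigned: the static block shadows it
    private static String urlFixed;  // assigned correctly

    static {
        // Re-declaring the variable creates a block-local String; the field stays null.
        String url = "jdbc:mysql://localhost:3306/itbz";
        // Dropping the type keyword assigns the existing static field.
        urlFixed = "jdbc:mysql://localhost:3306/itbz";
    }

    public static String getUrl()      { return url; }
    public static String getUrlFixed() { return urlFixed; }

    public static void main(String[] args) {
        System.out.println("url = " + url);            // prints "url = null"
        System.out.println("urlFixed = " + urlFixed);
    }
}
```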


JAVA Series / Stage 4: Database and AI Collaborative Technology in Practice / JDBC Technology — Post #11455


©2014-2025 Baizhan Huizhi (Beijing) Technology Co., Ltd. All Rights Reserved. Saidi International Industrial Park, Kechuang 14th Street, Beijing Yizhuang Economic Development Area
Site maintained by Baizhan Huizhi (Beijing) Technology Co., Ltd.
Beijing Public Security Registration No. 11011402011233 · Beijing ICP License No. 18060230-3 · Business License · Operating Permit: 京B2-20212637