
D:\python文件\爬虫\demo\first_scrapy>scrapy crawl baidu
2020-05-26 18:03:37 [scrapy.utils.log] INFO: Scrapy 2.1.0 started (bot: first_scrapy)
2020-05-26 18:03:37 [scrapy.utils.log] INFO: Versions: lxml 4.5.1.0, libxml2 2.9.5, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 20.3.0, Python 3.7.6 (tags/v3.7.6:43364a7ae0, Dec 19 2019, 00:42:30) [MSC v.1916 64 bit (AMD64)], pyOpenSSL 19.1.0 (OpenSSL 1.1.1g  21 Apr 2020), cryptography 2.9.2, Platform Windows-10-10.0.17134-SP0
2020-05-26 18:03:37 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2020-05-26 18:03:37 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'first_scrapy',
 'NEWSPIDER_MODULE': 'first_scrapy.spiders',
 'ROBOTSTXT_OBEY': True,
 'SPIDER_MODULES': ['first_scrapy.spiders']}
2020-05-26 18:03:37 [scrapy.extensions.telnet] INFO: Telnet Password: 2dc378fbfcafc19b
2020-05-26 18:03:37 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
2020-05-26 18:03:37 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-05-26 18:03:37 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-05-26 18:03:37 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2020-05-26 18:03:37 [scrapy.core.engine] INFO: Spider opened
2020-05-26 18:03:37 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-05-26 18:03:37 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2020-05-26 18:03:37 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://baidu.com/robots.txt> (referer: None)
2020-05-26 18:03:37 [scrapy.downloadermiddlewares.robotstxt] DEBUG: Forbidden by robots.txt: <GET http://baidu.com/>
2020-05-26 18:03:38 [scrapy.core.engine] INFO: Closing spider (finished)
2020-05-26 18:03:38 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 1,
 'downloader/exception_type_count/scrapy.exceptions.IgnoreRequest': 1,
 'downloader/request_bytes': 219,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 2680,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 0.291164,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 5, 26, 10, 3, 38, 88538),
 'log_count/DEBUG': 2,
 'log_count/INFO': 10,
 'response_received_count': 1,
 'robotstxt/forbidden': 1,
 'robotstxt/request_count': 1,
 'robotstxt/response_count': 1,
 'robotstxt/response_status_count/200': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2020, 5, 26, 10, 3, 37, 797374)}
2020-05-26 18:03:38 [scrapy.core.engine] INFO: Spider closed (finished)


I also followed the video and crawled Baidu the same way, but no HTML was returned.
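The log above shows why nothing comes back: the overridden settings include 'ROBOTSTXT_OBEY': True, the only 200 response is for http://baidu.com/robots.txt, and the actual page request is dropped with "Forbidden by robots.txt", so the page callback never runs. Below is a minimal sketch of the usual workaround; the spider body is an assumption reconstructed from the "scrapy crawl baidu" command in the log, since the original spider code was not posted.

# first_scrapy/settings.py -- disable robots.txt enforcement so the page request
# is no longer filtered out by RobotsTxtMiddleware (sketch; only this line matters here).
ROBOTSTXT_OBEY = False

# first_scrapy/spiders/baidu.py -- hypothetical minimal spider matching the log.
import scrapy

class BaiduSpider(scrapy.Spider):
    name = 'baidu'
    start_urls = ['http://baidu.com/']

    # Alternatively, override the setting for this spider only.
    custom_settings = {'ROBOTSTXT_OBEY': False}

    def parse(self, response):
        # With robots.txt enforcement off, this callback receives the HTML.
        print(response.text[:200])

Either change works on its own: the global setting affects every spider in the project, while custom_settings keeps the override local to this one spider.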


[image.png: screenshot of the error message]

import random
import string

from django.db import models

# Create your models here.


class Student(models.Model):

    name = models.CharField(max_length=64)
    age = models.IntegerField(default=18)
    sex = models.IntegerField(choices=((1, '男'), (2, '女')), default=1)
    card = models.CharField(max_length=128)

    @classmethod
    def insert_test_data(cls, num):
        """
        Bulk-generate test data.
        :param num: number of records to create
        :return: None
        """
        def random_str(row_ite, length):
            """
            :param row_ite: iterable of source characters to pick from
            :param length:  length of the string
            :return: a random string
            """
            # random.choices returns a list of characters; join it into a
            # string, otherwise the CharField receives a list.
            return ''.join(random.choices(row_ite, k=length))

        obj_li = []
        for _ in range(num):
            obj_li.append(cls(
                name=random_str(string.ascii_lowercase, random.randint(6, 10)),
                age=random.randint(18, 26),
                sex=random.choice([1, 2]),
                card=random_str(string.digits, 18),
            ))

        # Insert all generated objects in a single query.
        cls.objects.bulk_create(obj_li)

Teacher, when I ran the code above to insert data I got the error shown in the screenshot above, and changing max_length in the class didn't help. Please advise.
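For reference, a minimal way to exercise the helper from the Django shell (run "python manage.py shell" first); the app label "students" in the import is an assumption, so substitute the actual app name:

# The app name "students" is hypothetical; use the app that contains this model.
from students.models import Student

Student.insert_test_data(100)      # bulk-insert 100 random rows
print(Student.objects.count())     # verify the rows were created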

