Members can post questions here, and the 百战程序员 instructors answer every one.
Q&A threads that are helpful to everyone are marked as "Recommended".
After finishing a lesson, come back and browse the questions other students have asked; it will help you learn more thoroughly.
So far, students have asked a total of 133,787 questions.

15.1.8 安装 Webpack.zip

Teacher, at 4:14 in the video the instructor opens a website, but when I try to open that URL on my machine it can't be found. I don't know why. Could you please take a look for me?


WEB Front-end Full Series / Stage 12: Front-end Engineering (old) / Webpack (Post #3903)

Teacher, what is this error I'm getting?

PS D:\vscodeproject2\爬虫\Scarpy\scarpy05> & D:/python_env/spider2_env_/Scripts/Activate.ps1
(spider2_env_) PS D:\vscodeproject2\爬虫\Scarpy\scarpy05> & D:/python_env/spider2_env_/Scripts/python.exe d:/vscodeproject2/爬虫/Scarpy/scarpy05/scarpy05/begin.py
2023-12-26 22:01:30 [scrapy.utils.log] INFO: Scrapy 2.6.1 started (bot: scarpy05)
2023-12-26 22:01:30 [scrapy.utils.log] INFO: Versions: lxml 4.8.0.0, libxml2 2.9.12, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 22.4.0, Python 3.10.0 (tags/v3.10.0:b494f59, Oct  4 2021, 19:00:18) [MSC v.1929 64 bit (AMD64)], pyOpenSSL 22.0.0 (OpenSSL 1.1.1n  15 Mar 2022), cryptography 36.0.2, Platform Windows-10-10.0.19045-SP0
2023-12-26 22:01:30 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'scarpy05',
 'NEWSPIDER_MODULE': 'scarpy05.spiders',
 'SPIDER_MODULES': ['scarpy05.spiders']}
2023-12-26 22:01:30 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2023-12-26 22:01:30 [scrapy.extensions.telnet] INFO: Telnet Password: 2e9c559873783f27
2023-12-26 22:01:30 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
2023-12-26 22:01:30 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2023-12-26 22:01:30 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
Unhandled error in Deferred:
2023-12-26 22:01:30 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 206, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 210, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "D:\python_env\spider2_env_\lib\site-packages\twisted\internet\defer.py", line 1905, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "D:\python_env\spider2_env_\lib\site-packages\twisted\internet\defer.py", line 1815, in _cancellableInlineCallbacks
    _inlineCallbacks(None, gen, status)
--- <exception caught here> ---
  File "D:\python_env\spider2_env_\lib\site-packages\twisted\internet\defer.py", line 1660, in _inlineCallbacks
    result = current_context.run(gen.send, result)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 102, in crawl
    self.engine = self._create_engine()
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 116, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\core\engine.py", line 84, in __init__
    self.scraper = Scraper(crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\core\scraper.py", line 75, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\middleware.py", line 59, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\middleware.py", line 41, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\utils\misc.py", line 166, in create_instance
    instance = objcls.from_crawler(crawler, *args, **kwargs)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\media.py", line 76, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\images.py", line 112, in from_settings
    return cls(store_uri, settings=settings)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\images.py", line 55, in __init__
    super().__init__(store_uri, settings=settings, download_func=download_func)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\files.py", line 329, in __init__
    self.store = self._get_store(store_uri)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\files.py", line 378, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
builtins.KeyError: 'd'

2023-12-26 22:01:30 [twisted] CRITICAL:
Traceback (most recent call last):
  File "D:\python_env\spider2_env_\lib\site-packages\twisted\internet\defer.py", line 1660, in _inlineCallbacks
    result = current_context.run(gen.send, result)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 102, in crawl
    self.engine = self._create_engine()
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\crawler.py", line 116, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\core\engine.py", line 84, in __init__
    self.scraper = Scraper(crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\core\scraper.py", line 75, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\middleware.py", line 59, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\middleware.py", line 41, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\utils\misc.py", line 166, in create_instance
    instance = objcls.from_crawler(crawler, *args, **kwargs)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\media.py", line 76, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\images.py", line 112, in from_settings
    return cls(store_uri, settings=settings)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\images.py", line 55, in __init__
    super().__init__(store_uri, settings=settings, download_func=download_func)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\files.py", line 329, in __init__
    self.store = self._get_store(store_uri)
  File "D:\python_env\spider2_env_\lib\site-packages\scrapy\pipelines\files.py", line 378, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
KeyError: 'd'

(spider2_env_) PS D:\vscodeproject2\爬虫\Scarpy\scarpy05>
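The KeyError: 'd' at the bottom of both tracebacks is raised in FilesPipeline._get_store: Scrapy parses the configured storage location as a URI, reads the drive letter "d" as the scheme, and cannot find that scheme in STORE_SCHEMES. The project's settings.py is not shown in the post, but this usually means the IMAGES_STORE / FILES_STORE value was not recognized as an absolute Windows directory, typically because a backslash escape inside a non-raw string (for example \t or \n) mangled the path, or because the drive letter was written without a separator (like D:images). A minimal sketch of the relevant settings.py lines, with a purely hypothetical target folder:

# settings.py -- minimal sketch; the folder below is a hypothetical example, adjust it to your project.
# Use a raw string (or forward slashes) so backslash escapes cannot corrupt the path,
# and keep it absolute so Scrapy treats it as a local directory rather than as a URI
# whose scheme would be the drive letter "d".
IMAGES_STORE = r"D:\vscodeproject2\images"
# an equivalent spelling that avoids backslashes entirely:
# IMAGES_STORE = "D:/vscodeproject2/images"

After correcting the path, re-run begin.py; if the error persists, print the value of IMAGES_STORE (or FILES_STORE) at startup to confirm what Scrapy actually receives.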


Python Full Series / Stage 16: Python Crawler Development / Using the Scrapy Framework (Post #3908)

PlaneGame0.8.zip

Teacher, why do I get the error below when I run the program? I'm using JDK 1.8. Before I used an array to generate the 50 shells I got a similar error, and it went away after I moved the double-buffering code to after the main method. Now this error has appeared again. What is the cause? Also, line 45 itself shows no error in the editor.

Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException

at cn.sxt.game.MyGameFrame.paint(MyGameFrame.java:45)

at sun.awt.RepaintArea.paintComponent(Unknown Source)

at sun.awt.RepaintArea.paint(Unknown Source)

at sun.awt.windows.WComponentPeer.handleEvent(Unknown Source)

at java.awt.Component.dispatchEventImpl(Unknown Source)

at java.awt.Container.dispatchEventImpl(Unknown Source)

at java.awt.Window.dispatchEventImpl(Unknown Source)

at java.awt.Component.dispatchEvent(Unknown Source)

at java.awt.EventQueue.dispatchEventImpl(Unknown Source)

at java.awt.EventQueue.access$500(Unknown Source)

at java.awt.EventQueue$3.run(Unknown Source)

at java.awt.EventQueue$3.run(Unknown Source)

at java.security.AccessController.doPrivileged(Native Method)

at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)

at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)

at java.awt.EventQueue$4.run(Unknown Source)

at java.awt.EventQueue$4.run(Unknown Source)

at java.security.AccessController.doPrivileged(Native Method)

at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)

at java.awt.EventQueue.dispatchEvent(Unknown Source)

at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)

at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)

at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)

at java.awt.EventDispatchThread.pumpEvents(Unknown Source)

at java.awt.EventDispatchThread.pumpEvents(Unknown Source)

at java.awt.EventDispatchThread.run(Unknown Source)
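The trace shows the NullPointerException is thrown inside MyGameFrame.paint at line 45 while it is being called from the AWT event-dispatch thread ("AWT-EventQueue-0"), so this is a runtime problem, not a compile error, which is why the editor shows nothing wrong on line 45. A common cause in this exercise is that paint() dereferences a field (an image, the shell array, or the off-screen buffer) that has not been created yet when the first repaint arrives; whether it crashes then depends on timing, which would also explain why moving the double-buffering code around made the error come and go. A minimal sketch of the safe pattern, with hypothetical class and field names since the attached project is not reproduced here:

import java.awt.Frame;
import java.awt.Graphics;
import java.awt.Image;

// Minimal sketch, not the attached PlaneGame code: names and the resource path are assumptions.
public class GameFrameSketch extends Frame {

    private Image offScreenImage;   // off-screen buffer used for double buffering
    private Image planeImage;       // stands in for whatever paint() draws at line 45

    public void launchFrame() {
        // Create everything paint() touches BEFORE the window becomes visible;
        // the first repaint can arrive on the event thread right after setVisible(true).
        planeImage = getToolkit().getImage("images/plane.png"); // hypothetical resource path
        setSize(500, 500);
        setVisible(true);
    }

    @Override
    public void update(Graphics g) {
        // Standard AWT double buffering: draw into an off-screen image, then copy it once.
        if (offScreenImage == null) {
            offScreenImage = createImage(getWidth(), getHeight());
        }
        Graphics off = offScreenImage.getGraphics();
        paint(off);
        g.drawImage(offScreenImage, 0, 0, null);
    }

    @Override
    public void paint(Graphics g) {
        if (planeImage == null) {   // defensive guard: skip drawing until the fields exist
            return;
        }
        g.drawImage(planeImage, 100, 100, null);
    }

    public static void main(String[] args) {
        new GameFrameSketch().launchFrame();
    }
}

If the attached project initializes its objects in a constructor, check that the constructor assigns every field used on line 45 before setVisible(true) is called.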


JAVA Full Series / Stage 1: AI-driven JAVA Programming / Plane Battle Mini-project (Post #3909)

package DOM方式;

import java.io.IOException;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

import org.w3c.dom.Document;
import org.w3c.dom.NamedNodeMap;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;

public class TesTDDMParse {

    public static void main(String[] args) throws ParserConfigurationException, SAXException, IOException {
        // (1) Create a DocumentBuilderFactory object
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // (2) Create a DocumentBuilder object
        DocumentBuilder db = dbf.newDocumentBuilder();
        // (3) Obtain the Document object via DocumentBuilder's parse(...) method
        Document doc = db.parse("book.xml");
        // (4) Get the list of nodes via getElementsByTagName(...)
        NodeList bookList = doc.getElementsByTagName("book");
        System.out.println(bookList.getLength());
        // (5) Iterate over each node with a for loop
        for (int i = 0; i < bookList.getLength(); i++) {
            // (6) Get each node's attributes and attribute values
            Node book = bookList.item(i);
            NamedNodeMap attrs = book.getAttributes(); // the collection of attributes
            // iterate over each attribute
            for (int j = 0; j < attrs.getLength(); j++) {
                // get each attribute
                Node id = attrs.item(j);
                System.out.println("attribute name: " + id.getNodeName() + "\t" + id.getNodeValue());
            }
        }
        System.out.println("\nName and value of each node");
        // (7) Get each node's name and value
        for (int i = 0; i < bookList.getLength(); i++) {
            // get each book node
            Node book = bookList.item(i);
            NodeList subNode = book.getChildNodes();
            // iterate over each child node of book with a for loop
            for (int j = 0; j < subNode.getLength(); j++) {
                Node childNode = subNode.item(j);
                // System.out.println(childNode.getNodeName());
                short type = childNode.getNodeType(); // get the node's type
                // only print element children; the original compared type with itself, which is always true
                if (type == Node.ELEMENT_NODE) {
                    System.out.println("node name: " + childNode.getNodeName() + "\t" + childNode.getTextContent());
                }
            }
        }
    }
}



Here is the books XML:

<?xml version="1.0" encoding="UTF-8"?>

<books>

<book id="1001">

<name>Java开发</name>

<author>张小三</author>

<price>98.5</price>

</book>

<book id="1002">

<name>李四</name>

<author>Java</author>

<price>99.2</price>

</book>

</books>



Here is the error:

Exception in thread "main" java.io.FileNotFoundException: D:\java\第二阶段\XML技术\books.xml (系统找不到指定的文件。)

at java.base/java.io.FileInputStream.open0(Native Method)

at java.base/java.io.FileInputStream.open(FileInputStream.java:213)

at java.base/java.io.FileInputStream.<init>(FileInputStream.java:155)

at java.base/java.io.FileInputStream.<init>(FileInputStream.java:110)

at java.base/sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:86)

at java.base/sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:184)

at java.xml/com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:654)

at java.xml/com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:150)

at java.xml/com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:860)

at java.xml/com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:824)

at java.xml/com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)

at java.xml/com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:246)

at java.xml/com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:339)

at java.xml/javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:178)

at DOM方式.TesTDDMParse.main(TesTDDMParse.java:25)
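The FileNotFoundException is a path problem rather than an XML problem: DocumentBuilder.parse("book.xml") resolves a relative name against the JVM's working directory (in an IDE that is normally the project root), and the trace above shows the parser actually looking for D:\java\第二阶段\XML技术\books.xml while the code asks for book.xml, so the file name and its location both have to match exactly. Two ways to make the location explicit, sketched below with assumed paths:

import java.io.File;
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Minimal sketch; the file locations below are assumptions, adjust them to the real project layout.
public class ParseLocationSketch {
    public static void main(String[] args) throws Exception {
        DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();

        // Print where a bare file name like "book.xml" is actually resolved.
        System.out.println("working dir = " + new File(".").getAbsolutePath());

        // Option 1: pass an explicit File so the location is unambiguous.
        Document doc1 = db.parse(new File("D:/java/第二阶段/XML技术/book.xml")); // hypothetical absolute path
        System.out.println(doc1.getDocumentElement().getNodeName());

        // Option 2: keep book.xml on the classpath (e.g. in the source folder) and load it
        // through the class loader, which does not depend on the working directory.
        try (InputStream in = ParseLocationSketch.class.getClassLoader().getResourceAsStream("book.xml")) {
            if (in == null) {
                System.out.println("book.xml is not on the classpath");
            } else {
                Document doc2 = db.parse(in);
                System.out.println(doc2.getDocumentElement().getNodeName());
            }
        }
    }
}

With the attached code, the quickest check is to confirm whether the file in the project is named book.xml or books.xml and that it sits in the directory printed as the working directory above.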







JAVA Full Series / Stage 2: JAVA Fundamentals Deepening and Improvement / XML Technology (old) (Post #3915)
