Why do I get javax.servlet.UnavailableException: CrawlServlet for my filter?

OK, here is what I am trying to do.

I created a class named CrawlServlet in the server package.

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;


public class CrawlServlet implements Filter {

    @Override
    public void destroy() {
        // nothing to clean up
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        String requestURI = httpRequest.getRequestURI();
        if ((requestURI != null) && (requestURI.contains("_escaped_fragment_"))) {
            System.out.println(requestURI);
        } else {
            try {
                // not an _escaped_fragment_ URL, so move up the chain of servlet filters
                chain.doFilter(request, response);
            } catch (ServletException e) {
                System.err.println("Servlet exception caught: " + e);
                e.printStackTrace();
            }
        }
    }

    @Override
    public void init(FilterConfig config) throws ServletException {
        // nothing to initialize
    }
}

In lib/web.xml I have:

  <filter>
     <filter-name>CrawlServlet</filter-name>
     <filter-class>CrawlServlet</filter-class>
  </filter>


  <filter-mapping>
     <filter-name>CrawlServlet</filter-name>
     <url-pattern>/*</url-pattern>
  </filter-mapping>

After running it, I got this error:

Starting Jetty on port 8888
[WARN] 
java.lang.ClassNotFoundException: CrawlServlet
at java.lang.ClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
.....
[WARN] FAILED CrawlServlet: javax.servlet.UnavailableException: CrawlServlet
javax.servlet.UnavailableException: CrawlServlet
....
[ERROR] 503 - GET /Myproject.html?gwt.codesvr=127.0.0.1:9997 (127.0.0.1) 1299 bytes
   Request headers
      Accept: text/html, application/xhtml+xml, */*
      Accept-Language: en-AU
      User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
      Accept-Encoding: gzip, deflate
      Host: 127.0.0.1:8888
      If-Modified-Since: Wed, 16 Apr 2014 00:35:41 GMT
      Connection: keep-alive
   Response headers
      Cache-Control: must-revalidate,no-cache,no-store
      Content-Type: text/html;charset=ISO-8859-1
      Content-Length: 1299
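For context on the ClassNotFoundException above: the container resolves the `<filter-class>` value by its fully qualified name, so a bare simple name fails when the class actually lives in a package. Below is a minimal, hypothetical demo (the class name `ClassLookupDemo` and the use of `java.util.ArrayList` as a stand-in are mine, not from the project) sketching that lookup behavior:

```java
// Hypothetical demo: shows why a reflective class lookup fails for a bare
// class name when the class belongs to a package, as with "CrawlServlet".
public class ClassLookupDemo {

    // Returns true if the given class name can be resolved on the classpath.
    static boolean resolvable(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A simple name without its package is not found:
        System.out.println("ArrayList resolvable? "
                + resolvable("ArrayList"));                 // false
        // The fully qualified name is found:
        System.out.println("java.util.ArrayList resolvable? "
                + resolvable("java.util.ArrayList"));       // true
    }
}
```

By the same logic, a class declared in the server package would only resolve under its package-qualified name.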

What is wrong?

Can you fix this problem?

Source: Author Tum | 2014-05-16