Normally, when you crawl data you send each request sequentially and wait for the web page to respond. If you need to repeat this work many times, you have to track every result yourself, and it takes a long time before all of it completes. Task Queues solve this: you simply enqueue the work to be done, it runs automatically in the background, and you don't have to care how long it takes to finish. For example, when you register on a forum you don't wait for the confirmation e-mail before continuing; the e-mail is sent in the background while you can view the forum topics immediately.

First, create a queue.xml file under WEB-INF with this content:

<queue-entries>
  <queue>
    <name>default</name> <!-- queue name the servlet will refer to -->
    <rate>100/s</rate>   <!-- how many tasks are processed per second -->
  </queue>
</queue-entries>
Next, create a servlet that adds tasks to the queue:
package Showsiteinfo.cron;

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions.Method;
import static com.google.appengine.api.taskqueue.TaskOptions.Builder.withUrl;

public class DemoServlet extends HttpServlet {
    public void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Get the default queue configured in queue.xml
        Queue queue = QueueFactory.getDefaultQueue();
        for (int i = 0; i < 100; i++) {
            // Enqueue a GET task to the /queue URL with parameter name=abc
            queue.add(withUrl("/queue").method(Method.GET).param("name", "abc"));
        }
    }
}
Then open web.xml and map this servlet to the path "/demo".
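As a sketch, the mapping in web.xml could look like this (the servlet-name "demo" is my own choice here; adjust the class name to match your project):

```xml
<!-- Hypothetical web.xml mapping for the servlet above -->
<servlet>
  <servlet-name>demo</servlet-name>
  <servlet-class>Showsiteinfo.cron.DemoServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>demo</servlet-name>
  <url-pattern>/demo</url-pattern>
</servlet-mapping>
```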
Now, when we request "/demo", within about one second (maybe a little more) the app makes 100 requests to the /queue URL.
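The /queue URL must itself be handled by a worker servlet in the same app (also mapped in web.xml); the post does not show it, so here is a minimal hypothetical sketch of what it could look like:

```java
package Showsiteinfo.cron;

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical worker servlet: Task Queue delivers each enqueued task
// as an HTTP request to /queue, carrying the parameters set at enqueue time.
public class QueueWorkerServlet extends HttpServlet {
    public void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String name = req.getParameter("name"); // "abc" in the example above
        // Do the actual crawling work for this task here.
        // Returning a 2xx status marks the task as done; any other
        // status causes App Engine to retry the task later.
        resp.setStatus(200);
    }
}
```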
A limitation from Google: a task cannot call a URL on another domain or another app; it can only call URLs inside the same app that enqueues the tasks.

Last edited by shabbir; 15Jun2012 at 21:43.. Reason: Code blocks