I'm new to jQuery. Can you suggest some simple, easy-to-learn code for finding all the links on an external website, without having to actually be on that site? I'm experimenting with `urls = $$('a'); for (url in urls) console.log(urls[url].href);`, but it only works when we are already on the page whose links we want. I would also be fine with PHP, as long as it isn't too complicated. In short: how do I build a web spider, like Google's, using jQuery or PHP?
1. Download phpQuery-onefile from https://code.google.com/p/phpquery/downloads/list and extract it into a public directory. For example, create a folder named "parsers" under public and put it there.
2. Create a get-urls.php file and put it into the parsers directory in public (next to phpQuery-onefile.php):
<?php
require_once('phpQuery-onefile.php');

// Load the remote page passed in via ?url=... and parse it with phpQuery.
$document = phpQuery::newDocumentFileHTML($_GET['url'], 'utf-8');

// Collect the href attribute of every <a> element on the page.
$links = $document->find('a');
$result = array();
foreach ($links as $link) {
    $result[] = pq($link)->attr('href');
}

// Return the list of links as JSON.
echo json_encode($result);
3. On the client side (on the page where you want the links), call your server-side script, pass it the URL, and read the response:
<script>
$(function () {
    $.get('http://yourserver.com/parsers/get-urls.php', {'url': 'some_url_here'}, function (response) {
        response = $.parseJSON(response);
        for (var r in response) {
            var link = response[r];
            console.log(link);
        }
    });
});
</script>
1 Answer
You can't do it without fetching the page. At a minimum, you have to retrieve the page content server-side (for example with cURL) and echo it back to the client's browser. You can then use phpQuery to extract all the links from the HTML content.
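The server-side step described above boils down to two parts: fetch the remote HTML, then pull the `href` out of every `<a>` tag. As a rough, dependency-free illustration of just the extraction part (a regex sketch; real code should use a proper parser such as phpQuery or PHP's DOMDocument, and the function name `extractLinks` is mine, not from the answer):

```javascript
// Sketch of the link-extraction step: pull href values out of raw HTML.
// A regex is fragile compared to a real HTML parser, but it shows the
// idea without any dependencies.
function extractLinks(html) {
    var links = [];
    // Match <a ... href="..."> or <a ... href='...'>.
    var re = /<a\b[^>]*\bhref\s*=\s*("([^"]*)"|'([^']*)')/gi;
    var match;
    while ((match = re.exec(html)) !== null) {
        // Group 2 holds a double-quoted value, group 3 a single-quoted one.
        links.push(match[2] !== undefined ? match[2] : match[3]);
    }
    return links;
}

// Example:
var html = '<p><a href="https://example.com/a">A</a>' +
           '<a href=\'/b\'>B</a></p>';
console.log(extractLinks(html)); // [ 'https://example.com/a', '/b' ]
```

The get-urls.php script does the equivalent with `$document->find('a')` and `attr('href')`, which also handles markup a regex would choke on.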