Python: can't find any way to print data from MySQL using a function

kgsdhlau · asked 2021-06-18 · tagged: MySQL

I have created a script in Python that collects data from a webpage and stores it in MySQL. The data are inserted into MySQL correctly, and my script can also print them in the console.
My question is: how can I wrap the following three lines in a separate function and print the stored data from there?

mycursor.execute("SELECT * FROM webdata")
for item in mycursor.fetchall():
    print(item)

My full script:

import mysql.connector
from bs4 import BeautifulSoup
import requests

URL = "https://www.tripadvisor.com.au/Restaurants-g255068-c8-Brisbane_Brisbane_Region_Queensland.html"

def get_info(link):
    mydb = mysql.connector.connect(
      host="localhost",
      user="root",
      passwd = "123",
      database="mydatabase"
    )
    mycursor = mydb.cursor()
    mycursor.execute("DROP TABLE if exists webdata")
    mycursor.execute("CREATE TABLE if not exists webdata (name VARCHAR(255), bubble VARCHAR(255), review VARCHAR(255))")

    response = requests.get(link)
    soup = BeautifulSoup(response.text,"lxml")
    for items in soup.find_all(class_="shortSellDetails"):
        name = items.find(class_="property_title").get_text(strip=True)
        bubble = items.find(class_="ui_bubble_rating").get("alt")
        review = items.find(class_="reviewCount").get_text(strip=True)

        mycursor.execute("INSERT INTO webdata (name,bubble,review) VALUES (%s,%s,%s)",(name,bubble,review))
        mydb.commit()

    # I wish to use the following three lines within another function to do the same

    mycursor.execute("SELECT * FROM webdata")
    for item in mycursor.fetchall():
        print(item)

if __name__ == '__main__':
    get_info(URL)

Answer #1 (gzjq41n4)

Have you considered using the Django model framework? It lets you interact with any table as a Python object.
With an existing database you can convert the tables into models as described here; it is just a matter of configuring the database in your Django settings file:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'DB_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASSWORD',
        'HOST': 'localhost',   # or the IP address your DB is hosted on
        'PORT': '3306',
    }
}

Then you can use a Django management command to create models from the existing tables:

$ python manage.py inspectdb > models.py

The generated models will then be in your models.py file, where you can access them.
You will end up with an object like this (without knowing much about your database); you only need to add a __str__() method so the object is formatted properly when printed:

from django.db import models
import uuid

class ShortSellDetails(models.Model):
    id = models.UUIDField(default=uuid.uuid4, editable=False, unique=True, primary_key=True)
    name = models.CharField(max_length=255)
    bubble = models.CharField(max_length=255)
    review = models.TextField()

    def __str__(self):
        return "VALUES ({0}, {1}, {2})".format(
            self.name, self.bubble, self.review)

You can then interact with this object like any other model (make sure you add the app, e.g. shortSell, to INSTALLED_APPS and run ./manage.py makemigrations followed by ./manage.py migrate). For example:

from shortSell.models import ShortSellDetails

details = ShortSellDetails.objects.all()
for detail in details:
    print(detail)

Or you can create the objects directly through the model interface:

from bs4 import BeautifulSoup
import requests
from shortSell.models import ShortSellDetails

URL = "https://www.tripadvisor.com.au/Restaurants-g255068-c8-Brisbane_Brisbane_Region_Queensland.html"
response = requests.get(URL)
soup = BeautifulSoup(response.text, "lxml")

for items in soup.find_all(class_="shortSellDetails"):
    name = items.find(class_="property_title").get_text(strip=True)
    bubble = items.find(class_="ui_bubble_rating").get("alt")
    review = items.find(class_="reviewCount").get_text(strip=True)

    ShortSellDetails.objects.create(
        name=name,
        bubble=bubble,
        review=review)

details = ShortSellDetails.objects.all()
for detail in details:
    print(detail)
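As noted above, the app has to be registered before the migrations will pick the model up. A minimal sketch of the relevant settings entry, assuming the app is named shortSell as in this answer:

# settings.py: register the app so makemigrations/migrate can see the model
INSTALLED_APPS = [
    'django.contrib.contenttypes',
    'django.contrib.auth',
    # ... the other default apps ...
    'shortSell',  # app containing ShortSellDetails (name assumed from this answer)
]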

Answer #2 (igetnqfo)

I think the following shows how you can split things into separate functions and print the data as you asked. Make sure you commit outside the for loop, as suggested here.

import mysql.connector
from bs4 import BeautifulSoup
import requests

url = "https://www.tripadvisor.com.au/Restaurants-g255068-c8-Brisbane_Brisbane_Region_Queensland.html"

def connect():
    mydb = mysql.connector.connect(
      host="localhost",
      user="root",
      passwd = "123",
      database="mydatabase"
    )
    return mydb

def create_table(link):
    conn = connect()
    mycursor = conn.cursor()
    mycursor.execute("DROP TABLE if exists webdata")
    mycursor.execute("CREATE TABLE if not exists webdata (name VARCHAR(255), bubble VARCHAR(255), review VARCHAR(255))")

    response = requests.get(link)
    soup = BeautifulSoup(response.text,"lxml")
    for items in soup.find_all(class_="shortSellDetails"):
        name = items.find(class_="property_title").get_text(strip=True)
        bubble = items.find(class_="ui_bubble_rating").get("alt")
        review = items.find(class_="reviewCount").get_text(strip=True)
        mycursor.execute("INSERT INTO webdata (name,bubble,review) VALUES (%s,%s,%s)",(name,bubble,review))
    conn.commit() #this is where you should commit

def fetch_data():
    conn = connect()
    mycursor = conn.cursor()
    mycursor.execute("SELECT * FROM webdata")
    for item in mycursor.fetchall():
        print(item)

if __name__ == '__main__':
    create_table(url)
    fetch_data()
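
One thing the sketch above does not do is close the connections it opens. A minimal variation of fetch_data with explicit cleanup, assuming the same connect() helper as above:

def fetch_data():
    conn = connect()
    mycursor = conn.cursor()
    try:
        mycursor.execute("SELECT * FROM webdata")
        for item in mycursor.fetchall():
            print(item)
    finally:
        # release the cursor and connection even if the query fails
        mycursor.close()
        conn.close()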

Answer #3 (ftf50wuq)

The comments describe what was changed in this code:

import mysql.connector
from bs4 import BeautifulSoup
import requests

# This is a new function

def do_fetch(cursor):
    cursor.execute("SELECT * FROM webdata")
    for item in cursor.fetchall():
        print(item)

URL = "https://www.tripadvisor.com.au/Restaurants-g255068-c8-Brisbane_Brisbane_Region_Queensland.html"

def get_info(link):
    mydb = mysql.connector.connect(
      host="localhost",
      user="root",
      passwd = "123",
      database="mydatabase"
    )
    mycursor = mydb.cursor()
    mycursor.execute("DROP TABLE if exists webdata")
    mycursor.execute("CREATE TABLE if not exists webdata (name VARCHAR(255), bubble VARCHAR(255), review VARCHAR(255))")

    response = requests.get(link)
    soup = BeautifulSoup(response.text,"lxml")
    for items in soup.find_all(class_="shortSellDetails"):
        name = items.find(class_="property_title").get_text(strip=True)
        bubble = items.find(class_="ui_bubble_rating").get("alt")
        review = items.find(class_="reviewCount").get_text(strip=True)

        mycursor.execute("INSERT INTO webdata (name,bubble,review) VALUES (%s,%s,%s)",(name,bubble,review))
        mydb.commit()

    # This part is changed (moved to new function):
    do_fetch(mycursor)

if __name__ == '__main__':
    get_info(URL)
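
If you would rather have the helper return the rows to the caller instead of printing them, a small variation (hypothetical name fetch_rows) could look like this:

def fetch_rows(cursor):
    # return the rows so the caller decides what to do with them
    cursor.execute("SELECT * FROM webdata")
    return cursor.fetchall()

# inside get_info(), instead of do_fetch(mycursor):
# for row in fetch_rows(mycursor):
#     print(row)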
