I’m trying to fill out a form, attach a file, and submit it using RoboBrowser.
To open the page I do:
browser.open('url')
To get the form I do:
form = browser.get_form(id='id_form')
To enter the data into the form I do:
form['data_dia'] = '25' # for example
To submit the form I do:
browser.submit_form(form, form['btnEnviar'])
or just
browser.submit_form(form)
But this is not working: the form is not being submitted. While trying to fetch all the inputs on the page, I found that the submit button is not being picked up by RoboBrowser. Running:
todos_inputs = browser.find_all('input')
for t in todos_inputs:
    print(t)
does not print the input tag with id 'btnEnviar', even though in the page's HTML it sits inside the form. The other inputs of the form do show up, such as 'day', 'month' and 'year'.
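One way to narrow this down is to check whether 'btnEnviar' appears in the raw response text at all (in RoboBrowser that should be available via the underlying requests response, e.g. browser.response.text) before comparing it with what the parser recovers. If it is absent from the raw text, the button is probably injected by JavaScript and no HTML parser will ever see it. A minimal stdlib sketch, with a stand-in HTML string instead of the real page:

```python
from html.parser import HTMLParser

class InputCollector(HTMLParser):
    """Collect the id of every <input> tag the parser actually sees."""
    def __init__(self):
        super().__init__()
        self.input_ids = []

    def handle_starttag(self, tag, attrs):
        if tag == 'input':
            self.input_ids.append(dict(attrs).get('id'))

# Stand-in for the raw page source (browser.response.text in RoboBrowser).
html = """
<form id="id_form">
  <input id="data_dia" name="data_dia">
  <input id="btnEnviar" type="submit" name="btnEnviar" value="Enviar">
</form>
"""

# 1) Cheap check on the raw text: is the button in the HTML at all?
print('btnEnviar' in html)  # if False, the button is likely added by JavaScript

# 2) Check what a parser recovers from that same text.
collector = InputCollector()
collector.feed(html)
print(collector.input_ids)
```

If the button is present in the raw text but missing from the parsed inputs, trying a different parser backend (RoboBrowser accepts a `parser` argument, e.g. 'html.parser', 'lxml' or 'html5lib') may help, since they handle malformed HTML differently.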
I didn’t post the HTML because the page requires a login and password.
The problem is that RoboBrowser is not parsing all of the HTML, only part of it, so I am unable to submit the form. Is there a solution to this? Or is there another way to fill out and submit a form with tools other than RoboBrowser and BeautifulSoup?
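One alternative that avoids the HTML parser entirely is to replicate the form POST by hand: read the field names and the form's action URL from the page in the browser's developer tools, then send them directly. A hedged sketch with the standard library; the URL, the button's value and the extra date fields are assumptions, and the request is only built here, not actually sent:

```python
import urllib.request
from urllib.parse import urlencode

# Field names as they would appear in the page's HTML; 'data_dia' comes from
# the question, the rest are hypothetical and must be copied from the real page.
fields = {
    'data_dia': '25',
    'btnEnviar': 'Enviar',  # include the submit button's name/value pair too
}

body = urlencode(fields).encode('utf-8')
request = urllib.request.Request(
    'https://example.com/form-action',  # hypothetical action URL
    data=body,
    headers={'Content-Type': 'application/x-www-form-urlencoded'},
    method='POST',
)

# urllib.request.urlopen(request) would send it; on a page behind a login,
# the session cookie from the authenticated session must also be forwarded.
print(request.get_method(), request.full_url)
```

The same idea is usually more convenient with the requests library (a `requests.Session` keeps the login cookies between the login POST and the form POST), but the mechanics are identical: post the fields the server expects, including the submit button's name/value pair.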