
tvtv.us.ini fix

bizzs

Hi, does anyone have the new ini for tvtv.us? I am receiving the error "tvtv.us does not allow EPG grabbing". I have not had any problems with this ini for a long time and it just popped up yesterday.

Thanks
bizzs

Blackbear199

look under the FAQ section at the top of the page.

dariofol

I also have a problem with the tvtv.us/tvtv.ca site inis.

I don't understand the problem.
I run multiple configurations; it seems that in the home folder tvtv.us works, but when I use another configuration the same ini doesn't work.
I tried tvtv.ca in the main folder instead and it is unable to grab.

dariofol

Actually, in the main folder tvtv.us works... but when I put it in another configuration it is unable to grab, so I rule out an IP block.
Could it be due to cookies or something that keeps track of the grab?

I had this problem yesterday with tvtv.us.ini in the main folder; I updated WebGrab to 2.1.10, which seemed to resolve it, but today I have the same issue on another similar configuration (/EPG/configuration).

Attachments: 
Blackbear199

If you're using the run icon, that's your problem, as it looks in the default locations only.
To run WebGrab with files in any other location you need to do it on the command line or create a bat file.

/path/to/webgrab.exe /path/to/wg_config.xml

If your path has spaces, put it in quotes.
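
A minimal bat file along those lines might look like this (the paths are just placeholders taken from later in this thread; adjust them to your own install and config folder):

@echo off
rem Run WebGrab+Plus against a config folder outside the default location.
rem Both paths are quoted because they contain spaces.
"C:\Program Files (x86)\WebGrab+Plus\bin\WebGrab+Plus.exe" "C:\Users\Administrator\AppData\Local\WebGrab+Plus\Guide_US_PACIFIC"
pause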

dariofol

Thanks for the support... but unfortunately that is what I do: "C:\Program Files (x86)\WebGrab+Plus\bin\WebGrab+Plus.exe" C:\Users\Administrator\AppData\Local\WebGrab+Plus\Guide_US_PACIFIC. It worked until yesterday, so something has changed...

Blackbear199

in this location..

C:\Users\Administrator\AppData\Local\WebGrab+Plus\Guide_US_PACIFIC

inside this folder put your wg_config.xml and any ini that it will use.
that's all you need.

dariofol

Right, I put the same site ini and wg_config in both: in the main folder it works, in the other I get warnings... With tvtv.ca I get warnings in both locations.

Blackbear199

The warning about the site not wanting you to grab data means nothing; read about it in the FAQ section.

It doesn't matter where the files are; it doesn't change the fact that the site doesn't want remote software to access it.
It doesn't mean you cannot grab data, they just don't want you to.

dariofol

Yes, I understand, I read the FAQ, but from the same IP I can grab from tvtv.us when I run it in the main folder, not in the subfolder, and tvtv.ca doesn't work at all... how does the website recognize my grab?

Blackbear199

No idea, as "doesn't work" could mean anything.

Do you take your car to the garage and simply say "it broke" and expect it to get fixed?

dariofol

This is a wg_test_config:

<channel update="i" site="tvtvP.us" site_id="6304D/4451" xmltv_id="AWE TV">AWE TV</channel>
<channel update="i" site="tvtv.ca" site_id="36625D/9705" xmltv_id="RDS 2">RDS 2</channel>
<channel update="i" site="tvtv.us" site_id="36212D/594" xmltv_id="ESPN">ESPN</channel>

with 3 different site inis: tvtv.ca.ini, tvtv.us.ini and tvtvP.us.ini (the same tvtv.us.ini, only with a different time zone).
When I run it in the main folder, the two tvtv.us.ini / tvtvP.us.ini channels work and tvtv.ca.ini gives me a warning... how does the website recognize my grab?
When I put the same inis and the same wg_config in WebGrab+Plus\Guide_US_PACIFIC and run "C:\Program Files (x86)\WebGrab+Plus\bin\WebGrab+Plus.exe" C:\Users\Administrator\AppData\Local\WebGrab+Plus\Guide_US_PACIFIC, I receive warnings for all 3 channels.
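
For reference, those channel lines would sit inside the WebGrab++.config.xml roughly like this (a minimal sketch from memory; element names and values may differ for your version, so compare against your own working config):

<?xml version="1.0"?>
<settings>
  <filename>guide.xml</filename>
  <timespan>2</timespan>
  <update>i</update>
  <channel update="i" site="tvtvP.us" site_id="6304D/4451" xmltv_id="AWE TV">AWE TV</channel>
  <channel update="i" site="tvtv.ca" site_id="36625D/9705" xmltv_id="RDS 2">RDS 2</channel>
  <channel update="i" site="tvtv.us" site_id="36212D/594" xmltv_id="ESPN">ESPN</channel>
</settings>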

Attachments: 
dariofol

Do you think the problem is the WebGrab installation, or has something also changed on the website?

Blackbear199

Most likely because WebGrab saves the response to the robots check in a file, so in your main folder it was saved back when the site didn't check for this.

Since then the site enabled the robots check; your new folder doesn't have this file, so WebGrab created a new one, with the result that the site doesn't want you grabbing data.

Delete the robots folder in your main folder and you should get the message there also.

The bigger problem I see is the "no index" error.

Blackbear199

The robots check will not cause WebGrab not to grab data.
If nothing works no matter where it is, then I'd say the ini is broken.
I don't use it and am not bothering to download it; ask the one who made it to look at it.

dariofol

That's a good indication... the difference is in the .robots files.
Working:
###### tvtv.us.robots
# No data!
# Remote server error: (407) Proxy authentication required.
# Dummy robots-data created by WebGrab+Plus:

User-agent: *
Disallow:

Not working:
###### tvtv.ca.robots
User-agent: *
Disallow: /tvm/
Disallow: /gn/
User-agent: WebGrab+Plus
Disallow: /

Is there a way to fix this?

Blackbear199

copy your robots folder to your new directory.

Blackbear199

Another solution I just tested that worked: inside your robots folder, edit the file for that site, delete all the Disallow lines, and save it again.
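
For example, after deleting the Disallow lines, the tvtv.ca.robots shown above would be left with just:

User-agent: *
User-agent: WebGrab+Plus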

dariofol

Thanks very much, solved!!! For tvtv.ca I also made a copy of tvtv.us.robots and renamed it tvtv.ca.robots.

thanks very much
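
On Windows, that copy/rename can be done with a single command run from the folder that contains the robots directory (paths are illustrative; adjust to your install):

copy robots\tvtv.us.robots robots\tvtv.ca.robots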

Blackbear199

This is the first time I have seen a site block grabbing because of the robots check. I guess they are changing their ways.
The robots file is created when WG hasn't used the ini before (like a new installation or the first time using the ini), and it saves the response to a file so it doesn't have to do the check every time (it checks the robots folder to see if it has already been done).
You're just lucky it doesn't do this check every time and update the file.

Fallito

Thanks, BlackBear. Confirmed it's working now.

filipekav

Today I verified that tvtv.us.ini was no longer working.
I made some modifications and it worked again for me.
I am still learning how to fix these files.
If it is useful for the community, here are the fixed files.

Attachments: 
jleiss
filipekav wrote:

Today I verified that tvtv.us.ini was no longer working.
I made some modifications and it worked again for me.
I am still learning how to fix these files.
If it is useful for the community, here are the fixed files.

The provided tvtv.us.ini file does not generate a channel list correctly; I have attached a fixed ini file that will do this.
Be sure to rename the file to tvtv.us.ini and place it into the USA folder under sitepacks.

Attachments: 
jonathan7

Hi jleiss,
do you have the right ini for tvtv.us?

jleiss

the ini posted above still works for me.

jonathan7

it doesn't work for me
do you know why?

mat8861

Post your log, or at least tell us what error/problem you have.

jonathan7

Here are the log details:

Attachments: 
jleiss

Read the WebGrab+ FAQ, under problem solving, the entry "What means '!! -- Warning: site doesn't allow EPG grabbing !!'?".

mat8861

You can disregard "site does not allow EPG grab..."; it's just a warning. Now, to check why you get no index page, we should check which siteini revision and which channel you are trying to get. If you post your WebGrab .config and .log we may be able to help you.

jonathan7

I don't have the last revision; I have this one: [07/09/2018] r00ty.
None of the channels work at all.
Can someone send me the latest version of the tvtv.us ini, and also the channel list?

mat8861

I updated the siteini as the site changed URL (same as the one in the post above). You need to follow the instructions to create the channel list, then make sure you follow the instructions above for the robots file.

jonathan7

thank you. it's working now

Number1Guru

Could you assist me in getting this to work?
I've downloaded the latest ini file and removed the robots txt, and still nothing works.

mat8861
Number1Guru wrote:

Could you assist me in getting this to work?
I've downloaded the latest ini file and removed the robots txt, and still nothing works.

post your WebGrab++.log.txt and WebGrab++.config.xml

zarethustra

Hopefully it's OK for me to add on to this thread. I'm having the same issue; however, this is a brand new setup and I'm pretty new to this.
I tried using the posted ini file, but no go. I'm trying to generate my xml file for my area, and I am not sure I am doing it correctly.

Attachments: 
mat8861

You didn't update the siteini.pack; in your log you should have revision 3. Please pay attention to the remarks in the siteini.

zarethustra

Thanks for taking the time to reply. I didn't realize there was a more recent site pack, I thought the one I grabbed in this thread was the most recent. You fixed my issue.

jonathan7

How about the tvtv.ca ini?
Do you have the latest revision?

mat8861
jonathan7 wrote:

How about the tvtv.ca ini?
Do you have the latest revision?

Same reply as post #37... Canada folder.

chriskeens

I'm using Rev3. I have made sure the robots file has just the 2 lines and is set to read-only. I am now getting this error though...

[Critical]
Access to the path 'C:\Users\xxxxx\Desktop\WebGrab+Plus\robots\tvtv.us.robots' is denied.
[Critical]
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalDelete(String path, Boolean checkHost)
at WGconsole.Q.CheckRobots(String 0, c 1)
at WGconsole.G.1(String[] 0)
[Critical] Unhandled Exception

Any ideas where I am going wrong?

mat8861

Did you follow the instructions in the remarks of the siteini?

chriskeens

My tvtv.us.robots file contains just these two lines:
User-agent: *
User-agent: WebGrab+Plus

...and it is set to read only.

When not read-only, the robots file is overwritten. When set to read-only, WebGrab throws the access denied error.

It doesn't bother me now though, as I'm using tvguide.com instead.

mat8861

I don't know what setting you are using; in Windows I set it to read-only, get no errors, and it grabs fine. WG++ should not give you errors if the file is read-only... What are you using, Linux or Windows? What error do you have?

chriskeens

I'm using Windows. I have right-clicked the robots file and set read-only in the properties. The error that webgrab gives is...

[Critical]
Access to the path 'C:\Users\xxxxx\Desktop\WebGrab+Plus\robots\tvtv.us.robots' is denied.
[Critical]
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalDelete(String path, Boolean checkHost)
at WGconsole.Q.CheckRobots(String 0, c 1)
at WGconsole.G.1(String[] 0)
[Critical] Unhandled Exception

EDIT: However I seem to have now fixed it. I deleted the existing robots file and then ran webgrab again. Obviously this then failed but it created a new robots file with the unwanted lines. I then removed those lines, saved and set as read-only. Now it works fine. I'm guessing it just didn't like the old robots file for some reason.

dariofol

I noticed that every two months I have this error... I delete the robots folder, run WebGrab, which recreates it, then I modify the file content and set read-only permission... After that it works.

mat8861

Edit the robots file and leave only the two lines, then save.
After saving, right-click it and check the read-only box. Check in the details window that the attribute = RA.
Run WG.
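
If you prefer the command line, the built-in attrib command sets the same read-only flag (the path is just an example taken from the error above; point it at your own robots file):

attrib +R "C:\Users\xxxxx\Desktop\WebGrab+Plus\robots\tvtv.us.robots"

Use attrib -R on the same file if you ever need WebGrab to rewrite it.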

Attachments: 
jonathan7

I got this message about tvtv.ca and it is still there, but tvtv.us is okay.
What should I do?

mat8861

You can make a copy of the robots file and name it tvtv.ca.robots, if it is the robots file you are talking about.

aitana19021997

Where exactly can I find this robots file? tvtv.us is not working for me either.

Blackbear199

Same directory as your wg config.xml.
Look for a robots folder; it will be in there.
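
As an illustration, a typical layout would be something like this (folder names here are only an example):

WebGrab+Plus\
    WebGrab++.config.xml
    robots\
        tvtv.us.robots
        tvtv.ca.robots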


