
Not finding links & spam sites that refuse reports


Windrider6


Two issues:

1. SpamCop not finding links.

2. spam website cannot be reported.

It happens consistently: SpamCop cannot find any links when the spam I'm reporting is in HTML format.

Example:

http://mailsc.spamcop.net/mcgi?action=gett...rtid=1476939778

I haven't kept up with the details of SpamCop for a while now; I've just been using the service (mail & reporting) and hoping that spam will be parsed and reported properly.

It's in the e-mails that make it through to my mailbox that I notice the parsing problems, because I paste & report them and see the results.

In this example spam, the links:

<DIV><A href="http://www.TheFreeProject.com/index.php?referral=2052"><FONT color="#0000FF"><U>If you want to give it a try, go here.</U></FONT></A> </DIV>

<DIV> </DIV>

<DIV><A href="http://www.TheFreeProject.com/index.php?referral=2052"><FONT color="#0000FF"><U>http://TheFreeProject.com/index.php?referral=2052</U></FONT></A></DIV>

Neither is found, even though they are not obfuscated in any way.

So I paste the link into another reporting form and get a reporting address:

abuse[at]dreamhost.com

But what happens when I add that address for them to get a report?:

abuse[at]dreamhost.com does not wish to receive user-copied reports.

This website is paying spammers to send them visitors. It seems to me, this is a perfect example of the spammers, and who they work for, getting off easy. I doubt that the message source (rr.com) will do anything.

Link to comment
Share on other sites

1. Your provided link isn't worth a hoot. The "mailsc" part of the URL limits access to "paid-account" users .... the "action=gettrack&reportid=1476939778" further limits access to "you" ....

2. This then makes your code snippet problematic in two ways .. one, it would have been a repeated data point had you provided a Tracking URL instead ... and as it stands, the context of that snippet is unknown .. header section, MIME description, etc.

3. Have you looked at the FAQ here?

This website is paying spammers to send them visitors.

Possibly true, yet why are you following links in your spam? (hypothetical question)


OK. It's been a while, and I forgot what the proper link was. Here's the tracking URL:

http://www.spamcop.net/sc?id=z790471946z5a...d9fce483fa0043z

I didn't think seeing the whole spam was very important, as there was nothing there to reasonably explain why the link was not detected, but I could be wrong (sometimes I think I should change my middle name to "Often Wrong"). I quoted the pertinent link, and there was nothing there that would hide it.

There is a multipart boundary defined in the header of the spam, but not used in the body. I know that SpamCop used to have problems with that some time ago (when I last gave up trying to follow up on technical problems in SpamCop), but I figured that SpamCop would have fixed that by now. I figure that SpamCop should ignore the multipart boundary definitions, and look at the content of the spam, whatever is there, whether it is defined in a boundary or not.
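The boundary mismatch described above can be demonstrated with a stock MIME parser. Here is a minimal sketch in Python (the message text is invented for illustration, not the actual spam):

```python
# Sketch: how a strict MIME parser reacts when the header declares a
# multipart boundary that never appears in the body. The message text
# below is a made-up minimal example, not the actual spam.
from email import message_from_string

raw = (
    'Content-Type: multipart/alternative; boundary="XYZ"\n'
    "\n"
    '<DIV><A href="http://www.example.com/index.php?referral=2052">go</A></DIV>\n'
)

msg = message_from_string(raw)

# The parser trusts the header: it looks for "--XYZ" delimiter lines,
# finds none, and records a defect instead of recognizing any body part.
print(msg.defects)         # e.g. [StartBoundaryNotFoundDefect(), ...]
print(msg.is_multipart())  # False: no parts were ever recognized
```

A parser that behaves this way sees a "defective" message rather than a body containing links, which is consistent with the blank-looking result described in this thread.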

Of course, none of this matters, if we don't want to go after the spamvertised sites, ie. the people who are paying the spammers.

To Wazoo:

You asked why I would follow links in my spam. I don't.

When a spam begs you to visit a site, and the link contains "index.php?referral=2052", it is safe to say that the spammer is referrer #2052, and he will get a few cents if you click on the link. Thus the reason for the spam.
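The referral mechanism is easy to see mechanically. A small sketch using only the Python standard library (the URL is the one quoted from the spam above):

```python
# Sketch: extracting the affiliate/referrer ID from the spamvertised link.
from urllib.parse import urlparse, parse_qs

url = "http://www.TheFreeProject.com/index.php?referral=2052"
query = parse_qs(urlparse(url).query)

# The spammer is referrer #2052; each visit is credited to that ID.
print(query["referral"][0])  # -> 2052
```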


My opinion as follows (originally posted in Linear Post #7 in Topic Parsing problem?) stands.

the SpamCop Parsing and Reporting Service should not be HTMLCop, MIMECop, or SMTPCop, unless it wants to complain to the spammer's ISP about the spammer's disregard of HTML, MIME, and/or SMTP standards while it's complaining about the spam, rather than refusing to deal with the URL(s) in such spam because they wouldn't be clickable in perfectly standard strict HTML-rendering mail readers.


I didn't think seeing the whole spam was very important, as there was nothing there to reasonably explain why the link was not detected, but I could be wrong

As it turns out, the 'reason' it was not detected is exactly the one you go on to state yourself. The body construct does not match the header description of the included body content. Other folks with some e-mail apps would describe this spam as a "blank e-mail," as the body wouldn't show on their screen.

There is a multipart boundary defined in the header of the spam, but not used in the body.  I know that SpamCop used to have problems with that some time ago (when I last gave up trying to follow up on technical problems in SpamCop), but I figured that SpamCop would have fixed that by now.  I figure that SpamCop should ignore the multipart boundary definitions, and look at the content of the spam, whatever is there, whether it is defined in a boundary or not.

Not sure exactly what history you may be talking about, but the issue your sample demonstrates is the one that led to the Outlook/Eudora work-around web-form page ... an attempt to allow the processing of MIME-described bodies that Outlook/Eudora had mangled in their handling. Ignoring the content description and acting on 'anything seen' in the body would lead to a slew of bad reports.
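The trade-off being described can be sketched concretely. A naive scan that ignores the MIME description and acts on "anything seen" in the raw text would look something like this (the message and URLs are invented; `policy.example.net` stands in for an innocent bystander whose URL appears only in the headers, never in any rendered body):

```python
# Sketch of the failure mode: a scan that ignores the MIME description
# and pulls URLs out of the raw text, headers and all.
import re

raw = (
    "Received: from mail.example.net (see http://policy.example.net/why)\n"
    'Content-Type: multipart/alternative; boundary="XYZ"\n'
    "\n"
    '<A href="http://www.TheFreeProject.com/index.php?referral=2052">go</A>\n'
)

naive = re.findall(r'https?://[^\s"<>)]+', raw)
print(naive)
# Both URLs match; a report filed against policy.example.net's host
# would be exactly the kind of bad report being warned about here.
```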


Ignoring the content-description and acting on 'anything seen' in the body would lead to a slew of bad reports.


I respectfully disagree. IMHO, anything in spam that could be construed by a strictly-RFC-compliant email client, a Microsoft email client, or even a human as something to click on or to paste into the address bar of a browser is a spamvertized URL or email address, and every spamvertized URL should be reported to the administrator(s) of the system(s) that host that URL, of course with the exceptions of IBs (Innocent Bystanders) that have been so marked by the appropriate administrators. It is high time for the Parser to stand up for the rights of SpamCop's paying customers to report ALL of the reasonably-detectable spamvertized URLs they are bombarded with each day, rather than the rights of spammers to avoid reporting of their spamvertized URLs by manipulating their spam in non-RFC-compliant manners that Microsoft (and perhaps other) email clients make clickable anyway, and that some humans insist on pasting into their browsers. I understand that TPTB may get flack from administrators whose URLs are so reported, but why would such administrators complain rather than marking their URLs as IBs?

I understand that TPTB may get flack from administrators whose URLs are so reported, but why would such administrators complain rather than marking their URLs as IBs?


Because, as I understand the history, this has happened in the past to a large enough degree that the limitations were put in place.

And people in this day who click on any and all links, or even worse copy/paste something into a browser just because it is there, get what they deserve, including being listed on blocklists, because they have just infected themselves and are now helping the spammers. There is enough information out there in the regular media that they are on their own.

I'm all for keeping the spammers out of my inbox but I am not out to protect the world. There will always be criminals....protect yourself.


Different specifics, different locale, but related ....

From: "Mike Easter"
Newsgroups: spamcop
Subject: Re: Weekend education time...
Date: Sat, 30 Jul 2005 21:13:32 -0700
Message-ID: <dchj59$cg6$1[at]news.spamcop.net>

J G wrote:
> David Bolt scribbled:
>> davjam[at]davids:~> dig +short "almxqcawxbzo.net..pqqjdspvlwtaqf3sr6kv.mcilluderkb.info"
>> 221.7.209.72
> Odd, Sam Spade resolves it to 194.126.190.16 which is our favorite russki [at]tekcom...

One of the problems with resolving errors in URLs is that error correction is going to be different in a websniffer, which will be different in SS's GET or its browse web function, which will also be different in NetDemon's http browse function, which will be different in IE or FireFox or Netscape or Opera or whatever's tendency or willingness to resolve a misconstructed URL.

Designing a browser or parsing algorithm for URL error intolerance would be maddening to me. If it isn't a real URL, then doing all kinds of tapdances to imagine or rather 'make up' what kind of resolution the unpredictable error-correcting process would perform doesn't mean that the made-up answer is correct.

It seems to me that if something 'shouldn't' resolve from a correct construction point of view, then it /shouldn't/ resolve, even if it might resolve in some kind of zany error-tolerant browser, which chooses to resolve something improperly -- even if it works.

The world would be better off if there were a rule that browsers refuse to display pages with errors: "Sorry, this page has errors. Does not display." That would force all of the bad page editors to fix them, rather than leave them up for the most incompetent -- but most error-tolerant -- browser to come up with something.

Error intolerance shouldn't be the sine qua non of browser capability.

--
Mike Easter
kibitzer, not SC admin

In a follow-up post:

oops.

Mike Easter wrote:
> Error intolerance shouldn't be the sine qua non of browser capability.

s/intolerance/tolerance/

Error tolerance shouldn't be the sine qua non of browser capability.

--
Mike Easter
kibitzer, not SC admin
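The strictness Mike Easter advocates can be sketched as a hostname validator that simply refuses malformed names rather than guessing what was meant. This is an illustration of the principle only, not any real tool's implementation; the hostname is the one from the dig example above, which contains an empty label ("..") once the wrapped line is rejoined:

```python
# Sketch of strict, "error-intolerant" hostname validation: refuse a
# malformed name rather than guess what it should have been.
import re

# One RFC-1123-style label: 1-63 chars, letters/digits/hyphens,
# no leading or trailing hyphen.
LABEL = re.compile(r"^(?!-)[A-Za-z0-9-]{1,63}(?<!-)$")

def is_strict_hostname(host: str) -> bool:
    labels = host.rstrip(".").split(".")
    return all(LABEL.match(label) for label in labels)

# The name from the dig example has an empty label, so it fails cleanly.
print(is_strict_hostname(
    "almxqcawxbzo.net..pqqjdspvlwtaqf3sr6kv.mcilluderkb.info"))  # -> False
print(is_strict_hostname("www.spamcop.net"))                     # -> True
```

An error-tolerant resolver might silently collapse the empty label and resolve the name anyway, which is exactly the divergence between tools that the post describes.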


Whatever happened to the "general principle of robustness: be conservative in what you do, be liberal in what you accept from others" (Internet Standard 7, RFC 793, "Transmission Control Protocol", Section 2.10 "Robustness Principle", http://www.rfc-editor.org/rfc/rfc793.txt)? That Principle is generally attributed to Jon Postel (may he rest in peace). I think it should apply to Microsoft and other email clients, browsers, and DNS resolvers (which are a little too liberal in acceptance and clickability of URLs and URL-like strings, hostnames and hostname-like strings, and IP Addresses and IP-Address-like strings), as well as to the SpamCop Parser (which should be just as liberal, to compensate).


Whatever happened to the "general principle of robustness: be conservative in what you do, be liberal in what you accept from others" (Internet Standard 7, RFC 793, "Transmission Control Protocol", Section 2.10 "Robustness Principle", http://www.rfc-editor.org/rfc/rfc793.txt)? That Principle is generally attributed to Jon Postel (may he rest in peace). I think it should apply to Microsoft and other email clients, browsers, and DNS resolvers (which are a little too liberal in acceptance and clickability of URLs and URL-like strings, hostnames and hostname-like strings, and IP Addresses and IP-Address-like strings), as well as to the SpamCop Parser (which should be just as liberal, to compensate).


I think spam happened to 'being liberal in what you accept from others.'

Since spamcop doesn't add spamvertised sites to the blocklist, there is no need to be accurate or complete in reporting them. Considering the complexity of 'finding' some websites, the amount of programming needed to make it accurate would be way beyond its usefulness, I imagine.

The regulation of domains is a tricky business if one wants to avoid censorship. Given that the Internet is still run entirely on netiquette, the only polite way to stop spam is to reject it at the server based on the reputation of the sending IP address. There is no concept of 'punishment' of spammers in blocklists; it is simply a matter of ignoring those who are rude and pushy or thoughtless. Those who want to force spammers to stop will have to change the entire concept of the Internet before they succeed. And there will still be those who contravene the regulations.

Miss Betsy


The regulation of domains is a tricky business if one wants to avoid censorship.  ... And there will still be those who contravene the regulations.

Miss Betsy


Miss Betsy is on point yet again. It's hard to beat someone into submission with "please" and "thank you", but it is worth the effort to maintain a civil society.


What ever happened to the "general principle of robustness ..."?


I'm with you, Jeff G. Don't know why things headed off at a tangent immediately after (from communications protocols to net etiquette). The parsing of web links is not an essential feature, but it is one of the things that keeps users coming to SpamCop for assistance in their individual wars with those who spam them unmercifully and who come close to rendering the whole email system (indeed, the whole internet) unusable for all of us. Some spammers go to quite extraordinary measures to hide their links from parsing, which strikes me as an excellent reason to frustrate their desire to protect those links. That objective requires the parser to lag not too far behind the resolving ability of the major browsers/email applications.

None of which agrees with what Mike Easter was saying; he was overlooking the "general principle of robustness", which I am grateful to you for digging out. In retrospect, the principle is most "inconvenient" and has surely been applied beyond the scope of its original intent (I certainly wouldn't disagree with that contention) - but ignoring the ability of spammers to profit from its (effective though unequal) application is not the way to win the war. Yes, the critical task is to identify the originating IP, but if we want to enlist and keep reporters, they consistently ask for a bit more. They can't all be wrong.


Since spamcop doesn't add spamvertised sites to the blocklist, there is no need to be accurate or complete in reporting them.


I must respectfully disagree with you on that point, Miss Betsy. There surely IS a need to be accurate and complete in reporting spamvertised sites. The more spamvertised sites are reported accurately and completely, the more ISPs tend to do something about them. The more ISPs do something about them, the more the spammers tend to be forced away from those ISPs. The more spammers are forced away from those ISPs, the more spammers tend to be forced toward more expensive ISPs or out of business. The more spammers are forced out of business, the fewer spammers will still be in business bothering us. The fewer spammers are in business bothering us, the less spam we will get, and the cleaner our mailboxes will be. It all starts with being accurate and complete in reporting spamvertised sites.

Jeff G why don't you just quote yourself at:

http://forum.spamcop.net/forums/index.php?showtopic=4632

or

http://forum.spamcop.net/forums/index.php?...indpost&p=30977.

I guess if you keep saying it, maybe someone will believe.


I responded to a particular statement. In the interest of a free and open discussion, why don't you believe? Thanks!

Also, please be aware that the second URL you pasted above, "http://forum.spamcop.net/forums/index.php?...indpost&p=30977", doesn't work because you quoted it incorrectly: "..." shouldn't appear in a SpamCop Forum URL, only in the on-screen text representation of it (the URL is abbreviated on screen because of its excessive length, a design feature/flaw). The correct URL is http://forum.spamcop.net/forums/index.php?...indpost&p=30977.


I must respectfully disagree with you on that point, Miss Betsy. There surely IS a need to be accurate and complete in reporting spamvertised sites. The more spamvertised sites are reported accurately and completely, the more ISPs tend to do something about them. The more ISPs do something about them, the more the spammers tend to be forced away from those ISPs. The more spammers are forced away from those ISPs, the more spammers tend to be forced toward more expensive ISPs or out of business. The more spammers are forced out of business, the fewer spammers will still be in business bothering us. The fewer spammers are in business bothering us, the less spam we will get, and the cleaner our mailboxes will be. It all starts with being accurate and complete in reporting spamvertised sites.


I do not believe your series of events is accurate, because the only spam I have gotten over the last year is from the same ISPs, which do nothing about it: China, Russia, etc. How many sites have you actually taken down in the last year? How many ISPs have actually done something about the spamvertized sites?

I believe the vast majority of spammers are already stuck with a few spammer-friendly ISPs, but their spew continues because they can still get their message out (through zombied machines, etc.).

I would rather work more on getting the ISPs to stop the email flow, causing fewer people to see the messages, causing less income for the spammers. I believe this is the more effective avenue for spamcop to pursue with its resources. Trying to track down every intentionally messed-up link, portion of a link, etc. is going to take a lot of resources, because for every new trick used, a new check in the code will need to be developed. That, in turn, will slow down the processing for everyone.

Just my opinion...


If you are suggesting an option to make slow parsing faster by not looking at message bodies at all, I think that's a viable option for a certain segment of the Reporter community that feels the need for speed, but doesn't trust Quick Reporting. I just am not in that segment.


If you are suggesting an option to make slow parsing faster by not looking at message bodies at all, I think that's a viable option for a certain segment of the Reporter community that feels the need for speed, but doesn't trust Quick Reporting.  I just am not in that segment.


No, I believe that any number of people looking for spamvertized web sites using a more extensive search will slow down the processing for everyone, including the quick reporting. All processing is done on the same set of machines, as I understand it. Where there are now (say, for example) 5 different ways to search for links (href within html, http within text, etc.), a more extensive search will need probably 3 times that number, slowing the processing of that submission and everything else queued to run.
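The cost argument above can be illustrated with a toy multi-strategy extractor. The three patterns below are invented examples of the kinds of heuristics being discussed, not SpamCop's actual code; the point is that each added strategy is another full pass over the body:

```python
# Toy multi-strategy link extractor: each strategy is a separate pass over
# the body, so cost grows with the number of strategies.
import re

STRATEGIES = [
    re.compile(r'href\s*=\s*"([^"]+)"', re.I),   # href within HTML
    re.compile(r'\bhttps?://[^\s"<>)]+', re.I),  # bare http:// within text
    re.compile(r'\bwww\.[^\s"<>)]+', re.I),      # scheme-less www. links
]

def find_links(body: str) -> set:
    found = set()
    for pattern in STRATEGIES:  # one full scan per strategy
        found.update(pattern.findall(body))
    return found

body = '<A href="http://TheFreeProject.com/index.php?referral=2052">go</A>'
print(find_links(body))
```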

If they could/would use the current scheme for most people, and have those that want everything that could in any way be considered a link running on a different set of machines, I would say fine. As I stated, I do not find reporting most spamvertized web sites to be very useful because they are on spammy sources to begin with.

Again, how many of your reported (manually or via spamcop) spamvertized web sites have been taken down by the ISP? What percentage of the cases does this happen?

Maybe my spam over the years has not been representative of what is out there, but from my experience here, reporting more links will not help in any significant way. Adding significant load to the servers for insignificant gain does not make sense to me.


Again, how many of your reported (manually or via spamcop) spamvertized web sites have been taken down by the ISP?  What percentage of the cases does this happen?


Sorry, I don't know, I don't have time to track them all.

  • 2 weeks later...

I use Spamcop only for free reporting - my personal reward is the satisfaction of knowing that I'm sticking it to those bastards by shutting down their web sites. It makes sense to me to focus on the web site, since that's the money maker, there are fewer of them, and the web site is a visible target for my indignation. I was under the impression I'd had some part in getting many of them shut down.

I have not been so motivated by the prospect of reporting the e-mail source, because I had thought that was a pretty hopeless job, since e-mails can come from seemingly billions of possible sources these days (Comcast). It's just not as sexy as shutting down web sites.

Until the last few months I was getting spam hosted at a wider range of ISPs than these days. Now it's mostly a resilient handful such as Tekcom, and CNC, etc. There's an upstream spam filter so maybe that's why.

But the point is that I allowed myself to imagine we were getting down to the last holdouts, and that that was because of effective reporting of spam sites by Spamcop and effective responses by the responsible ISPs. If that battle is being won, then it's important to keep the pressure on there.

Another point is that shutting down spammer web sites is a highly motivating thing, and it's worth devoting some effort to reporting web sites, to encourage on-going reporting.


I doubt that the message source (rr.com) will do anything.


They appear to have sent the following or a similar message as a first step:
----- Original Message -----

From: "RR Webmaster" <webmaster[at]hvc.rr.com>

To: [all account holders]

Subject: Abuse Issue: IP Address at time on date

Dear Road Runner Subscriber:

Please read this e-mail completely, as it pertains to the continuation of your Road Runner account. The master account holder should respond to this email.

We have received multiple complaints that unsolicited email was sent from your account.  Having reviewed the complaint, we have confirmed that the IP address associated with the email traces back to your Road Runner account. 

Please make sure that no one is using your computer to perform this activity. If you are sure that no one has performed this activity, then your computer could be infected with a virus or trojan horse which has the ability to generate unsolicited email (spam/junk mail). Viruses can be brought into your system through file downloads and/or email attachments.  In order to prevent this activity from continuing, we suggest that you run an updated virus scan to target and delete this virus or trojan from your computer.  It is our policy to follow up on any incident that may be in violation of Road Runner's Acceptable Use Policy and/or Customer Agreement.

Accordingly, we strongly recommend that you take steps to secure your computer, this includes installing updated anti-virus software, installing a firewall, and keeping your operating system patched.  We view the sending and relaying of unsolicited email as a serious matter, and wish to remind you that any additional complaints regarding your account may, after review, lead to the suspension or cancellation of your Road Runner account.

If you do not have an anti-virus program or firewall, we recommend that you visit one of the web sites listed below. These are sites of commonly known anti-virus programs (some free ones are included in this list). Time Warner Cable and RoadRunner do not endorse or support any of these products. They are listed for your reference and represent only a portion of those commercially available. Additionally, Road Runner is now providing a free antivirus/firewall (found on the download page of the Road Runner home page, www.rr.com).

Because this activity does put our service at risk, as well as our customers, we do ask that you reply to this email indicating action has been taken to resolve this issue.

If you have any questions, please send an email to internet.security[at]twcable.com

Sincerely,

Abuse & Security Coordinator

Time Warner Cable High Speed Internet

Saugerties, NY

Anti-Virus Software

Most anti-virus software will detect programs that may allow remote access to your computer (Trojans), or perform activities or functions that may corrupt data on your computer. If you decide to use an anti-virus program, remember to keep it updated so you will be protected from new viruses. Here are just a few of the many anti-virus programs out there.

http://www.pandasoftware.com/  (both web and PC based)

http://housecall.antivirus.com/  (both web and pc based, free web trial)

http://www.grisoft.com/us/us_index.php (AVG software)

http://www.free-av.com/ (AntiVir Personal Edition)

http://www.mcafee.com/ (both web and pc based, free web trial)

http://www.symantec.com/nav/ (pc & Macintosh based)

http://www.datafellows.com/products/anti-virus/ (pc based for networks)

Firewall Software

Most firewall software will both detect other computers trying to gain entry into your computer and tell you about programs on your PC that are attempting to access the Internet without your knowledge. Here are just a few of the many firewall programs out there.

http://www.kerio.com/kerio.html (Kerio Software)

http://soho.sygate.com/default.htm (Sygate Technologies)

http://www.mcafee.com/ (Mcafee Software)

http://www.symantec.com/  (Norton)

http://www.zonealarm.com ( Zone Labs)

In addition, we recommend that you keep all software, especially Internet-related software, up to date and fully patched to assist in preventing unauthorized access or exploits.

You can find a list of some of these programs by visiting the following web sites:

http://windowsupdate.microsoft.com/ (Microsoft Windows Operating System Updates)

http://www.moosoft.com/thecleaner (The Cleaner - Trojan Cleaner)

Time Warner Cable's site provides links to third-party web sites, which are not under the control of Time Warner Cable. Time Warner Cable makes no representations about third-party web sites. When you access a non-Time Warner Cable web site, you do so at your own risk. Time Warner Cable is not responsible for the reliability of any data, opinions, advice, or statements made on third-party sites. Time Warner Cable provides these links merely as a convenience. The inclusion of such links does not imply that Time Warner Cable endorses, recommends, or accepts any responsibility for the content of such sites.


They appear to have sent the following or a similar message as a first step:


...How did you become aware of this, Jeff G -- do you know a Road Runner customer who received this?

...This is great. Now, if we could just get ESPs (e-mail service providers) and ISPs to require all customers and prospective customers to demonstrate that they have all the necessary protection before granting them access! "But I can dream, can't I?" :D <big g>


Archived

This topic is now archived and is closed to further replies.
