Automated software in SEO. Is it really evil?

by AddisonTorres3047 « on: 07-18-2012, 02:57:51 »
There are tons of messages and posts like "don't you ever use automated software or else you'll be banned" all over the Internet. But is it really so harmful and dangerous? Usually the people who write these posts have no idea how such software works or what it actually does.
Some SEO experts really have had bad experiences with submitters, article rewriters, and other SEO tools, and now they try to avoid repeating those failures. Those who are a little smarter still use submitters, but with care. I'll tell you that this is a wise decision and explain why.

I am a software developer and used to write social bookmark, article, and directory submitters. I have tried different programming frameworks, written for different platforms, and gained enough experience in automated software development to explain how it works, what it does, and what you should expect from it.

First of all, you should understand that there are only a few ways to build an automated submitter (or at least its core):
1. You can use the website's API.
2. You can send raw HTTP requests, just like an ordinary web browser does when you press the submit button.
3. You can integrate a web browser component right into your application.

The first approach is the simplest, easiest, fastest, and most reliable. Besides, you will never get banned by a website that provides an API (as long as you follow its rules and guidelines, of course). The main disadvantage of this approach is that only a few websites provide an API.

The second approach is the most difficult, yet it lets you build a multithreaded application that can submit to dozens of websites in a row. Usually developers don't bother to mimic a web browser's HTTP requests precisely and omit headers such as Accept, Host, Referer, and User-Agent. Such applications get banned really quickly.

The last approach is the most common. Such applications are really slow compared to the first two, since the web browser component sends dozens of requests to load a web page along with its scripts, styles, and images. On the other hand, there is no way for a web server or Google to distinguish such an application from ordinary user activity.

There is no way to get banned just because you used automated software in your SEO. It is all about how you use it. If your application manages accounts, fills registration and login forms for you, and automates common routines, you just save your time. But if your application posts hundreds of messages on blogs and forums in a single day, that is a problem.

So don't write "don't you ever use automated software or else you'll be banned". It is not true. Use it, but do it wisely. And if you have read this article to the end, you might find it interesting to try my SEO Browser, watch some video tutorials, and save a lot of time in your future SEO.

