I won't go off on a rant about web services (XML-RPC and SOAP, specifically). But I am going to show you the secret weapon for using web services from Applescript. If you're reading this, chances are you know that Applescript can access web services on Mac OS X. You may even have tried using Applescript to access a web service - and chances are you pulled out a few hairs trying. Applescript's SOAP and XML-RPC implementation is decent, but it leaves something to be desired: the description you have of a web service may seem clear cut, but translating it into an Applescript call can be very frustrating because of how Applescript names the different parts of the remote function call. But enough about that. Here is the easy way, assuming your web service has a WSDL description. (Note that web services served from ASP.NET haven't worked well for me from any language, which is a shame since there are just so many of them out there. So if you see ".asmx" in your SOAP url, you may be out of luck.) Open Terminal.
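From memory, the invocation looks roughly like this - WSMakeStubs comes with Apple's Developer Tools, and the output name and WSDL url below are placeholders you'd replace with your own:

```shell
# Generate AppleScript stubs from a WSDL description.
# -x      : output language (WSMakeStubs also does ObjC and C++)
# -dir    : where to write the generated file (here, the desktop)
# -name   : base name for the generated stub file (WSStub -> WSStub.as)
# -url    : the WSDL url for your web service (placeholder below)
/Developer/Tools/WSMakeStubs -x AppleScript -dir ~/Desktop -name WSStub \
    -url http://example.com/service.wsdl
```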
Open the "WSStub.as" file on your desktop. You now have all of the Applescript code needed to access the web services described by the WSDL file at the url you gave WSMakeStubs. Nifty, huh? [ 4/02/2005 04:17:00 AM ]
Monday, March 28, 2005
Java Cryptography Extension 1.4.2 Unlimited Strength Jurisdiction Policy for Mac OS X
The default distribution of the JCE allows what Sun calls 'strong, but limited strength cryptography'. This basically means that you cannot use RSA keys bigger than 2048 bits, or symmetric ciphers with keys longer than 128 bits. ElGamal is not allowed at all, so DH/DSS cannot be used for encryption.
The same holds true for most non-Sun crypto providers (Cryptix and BouncyCastle are the ones I have been working with). There are install instructions for other platforms, but the Mac OS X install works slightly differently. Instructions for enabling “unlimited” JCE crypto on Mac OS 10.3.x are [here]. (Thanks to the infamous Vinnie for hosting the MacCrypto forum, no matter how low the traffic is.) UPDATE: If you are working with JCEs on Mac OS X, you'll also be interested in [this document from Sun] on the java.security file, which on Mac OS X is located at:
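A quick way to check which policy is in effect is a sketch like the one below. The `Cipher.getMaxAllowedKeyLength` call only appeared in Java 5, so on 1.4.2 the try/catch with a 256-bit key is the actual test:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class JceCheck {
    public static void main(String[] args) throws Exception {
        // With the default "strong" policy this reports 128 for AES;
        // with the unlimited policy installed it reports Integer.MAX_VALUE.
        // (Java 5+ only - not available on 1.4.2.)
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key bits: " + max);

        // The 1.4.2-era check: try to init a 256-bit AES cipher.
        // Under the limited policy this throws InvalidKeyException.
        byte[] key = new byte[32]; // 256-bit key, all zeros - for the check only
        Cipher c = Cipher.getInstance("AES");
        try {
            c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"));
            System.out.println("256-bit AES allowed: unlimited policy is active");
        } catch (java.security.InvalidKeyException e) {
            System.out.println("256-bit AES rejected: limited policy in effect");
        }
    }
}
```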
Now, because search engines at the time took months to update their indexes, the results of deploying these engineered pages in the wild would take a while to show up. Unfortunately, management didn't see fit to follow that through: the pages were deployed, more or less left there, and the project was cancelled. The company tried buying keywords through GoTo.com and other services instead, but those had poor returns. Several months later, while going through error logs to diagnose a completely different problem, we saw thousands of 404 errors for those pages. It turned out that the engineered pages did indeed get indexed, and did indeed rank well - but an update to the website had wiped out their parent directory. After figuring out the source of the problem, we set up a redirect for those pages using mod_rewrite and sent all of that traffic through our affiliate system. It turned out to be a good amount of traffic overall, though sales had good days and bad days (the keywords chosen for the pages we deployed weren't exactly ideal).
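The actual rules are long gone, but the idea was a rewrite along these lines - the directory name, script path, and query parameters here are all made up for illustration:

```apache
# Hypothetical sketch: catch requests for the wiped-out directory of
# engineered pages and route them through the affiliate system
# instead of returning a 404.
RewriteEngine On
RewriteRule ^/keywords/(.+)\.html$ /affiliate.cgi?src=seo&page=$1 [R=302,L]
```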
A few months later we had one of the more infamous search engine optimization "experts" come and give our web developers a lecture. His techniques were crude at best: using Excel to calculate the keyword weight of meta tags, and using poorly formed HTML to inflate rankings. Being the smartass I am, I stumped him on a few questions, and basically threw the premise of my old project in front of him:
Me: "Well, what if you could automate all of that, since it's just math, and generate the pages to get better rankings? Or better yet, have a process that runs against the search engines, continuously refining your pages to get the best results?"
SEO: "You'd be unstoppable. But nobody has the technology yet."
During the lunch break he offered to hire me, out of earshot of management.
Today, most of that project is irrelevant. Google controls 40% of the web search market, and those techniques don't work on Google. Instead of just indexing content, Google relies heavily on a page's link affinity, originally via their [PageRank] algorithm. When I was working out how to "attack" search engines, this was something I thought they would try eventually, though I never expected the success PageRank would bring Google. Systems that rank pages by link affinity are actually in many ways easier to attack than systems doing full lexical analysis of documents: it's not that difficult to get many, many sites to link to yours, and of course there are plenty of ways to automate that. With the recent popularity of web logs, comment spam has become an issue - a script posts links to a site in a web log's comment form, and that increases the site's link affinity in Google. That's one way to increase your search ranking in Google, though not a particularly neighborly one. Google's real strength lies in its backend - the custom filesystems they use to index the web quickly and incrementally. Just as fast as you can spam out pages that link to your site, they can reindex with you at the bottom of the rankings. They can do all kinds of things that make your life difficult - but they play by the rules: if you get a high ranking while abiding by those rules, you will not be penalized.
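The core idea behind link-affinity ranking is simple enough to sketch. This is not Google's implementation, just a toy power iteration over a three-page link graph with the usual 0.85 damping factor - which also shows why comment spam works: every inbound link feeds rank into the target page.

```java
import java.util.Arrays;

public class PageRankSketch {
    /** Toy PageRank: power iteration over a tiny 3-page link graph. */
    static double[] rank() {
        int[][] links = { {1, 2}, {2}, {0} };  // links[i] = pages that page i links to
        int n = links.length;
        double d = 0.85;                       // damping factor
        double[] rank = new double[n];
        Arrays.fill(rank, 1.0 / n);            // start with rank spread evenly

        for (int iter = 0; iter < 50; iter++) {
            double[] next = new double[n];
            Arrays.fill(next, (1 - d) / n);    // the "random surfer" teleport share
            for (int i = 0; i < n; i++) {
                // each page splits its damped rank evenly across its outlinks
                double share = d * rank[i] / links[i].length;
                for (int j : links[i]) next[j] += share;
            }
            rank = next;
        }
        return rank;
    }

    public static void main(String[] args) {
        // Page 2 collects links from both page 0 and page 1,
        // so it ends up with the highest rank.
        System.out.println(Arrays.toString(rank()));
    }
}
```

Getting more sites to link to you raises your row in that iteration; that is the whole attack surface the comment spammers are exploiting.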
And hey, it isn't that hard, even if an AdWords campaign is a better investment. [ 3/27/2005 09:10:00 PM ]