Thank you, Zdravko, for expressing so eloquently what we all think about
half-thought-out business processes and strategy.
While GOOG's motto is "Do No Evil", by restricting things to the point of
ridiculousness they have created a world of US vs. THEM. We the users are
the devil, or THEM, and GOOG, the US, is the protector of internet content.
GOOG is a business, and it should operate as one. However, GOOG seems
content to stay in its little niche (advertising and marketing); its
long-term strategy is non-existent at the moment. It appears to create
products that it does not want to sell, and then places absurd restrictions
on their use. Many of us would gladly pay for the GOOG Search and Translate
APIs under reasonable restrictions, and we have said so in this group.
Regards,
Dalila
On Apr 7, 2011, at 10:44 PM, Zdravko Gligic wrote:
In an absolutely worst-case scenario, bandwidth and/or even CPU cycles
should not cost you any more than what they charge for the same resources
on custom Google App Engine implementations. Furthermore, not being able to
cache translations is simply self-defeating. API-based translations are NOT
like YouTube video players or search results, over which they have to
retain control in order to insert their advertising. Heck, while I do not
know how far they have gone with it by now, a couple of years back one was
able to cache things like Amazon book title information, right down to
prices, but only for a specified period of time, precisely in order to
force a certain amount of refreshing so that book prices did not become
terribly out of date. Who would it hurt if GOOG translations were cached,
for as long as the original text remains unchanged and for a given period
of time, if their concern is that the evolving translation internals could
produce better results over time?
Heck, I would even be ecstatic if they allowed Google App Engine hosted
clients to cache such translations in their BigTable, with full permission
for Google to monitor and even occasionally invalidate such caches in order
to force them to be retranslated and refreshed. While this alone is an
excellent win-win suggestion, it does require that their left hand (the GAE
team) know what their right hand (the translation team) is doing.
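For what it is worth, here is a minimal sketch of what such a cache could
look like on GAE, using the App Engine memcache API with a fixed expiration
so that entries would still get refreshed periodically. The one-week TTL
and the key scheme are purely my own assumptions, not anything Google has
sanctioned:

    import com.google.appengine.api.memcache.Expiration;
    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    // Hypothetical translation cache; the one-week TTL and key scheme are
    // assumptions, not anything Google has specified.
    public class TranslationCache {

        private static final int TTL_SECONDS = 7 * 24 * 60 * 60;  // assumed refresh window

        private final MemcacheService cache =
                MemcacheServiceFactory.getMemcacheService();

        // Key the cache on the language pair plus a hash of the source text.
        private String key(String text, String from, String to) {
            return "t9n:" + from + ":" + to + ":" + text.hashCode();
        }

        // Returns the cached translation, or null if it expired or was never stored.
        public String get(String text, String from, String to) {
            return (String) cache.get(key(text, from, to));
        }

        // Stores a translation with a TTL so stale entries eventually get redone.
        public void put(String text, String from, String to, String translated) {
            cache.put(key(text, from, to), translated,
                    Expiration.byDeltaSeconds(TTL_SECONDS));
        }
    }

A cached entry expiring simply means the next request retranslates that
string, which is exactly the kind of periodic refresh the Amazon
arrangement used to force.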
I will give you a very, very sad scenario that I have noticed and in which
I find myself. YouTube videos are designed so that they can be tagged with
"developer tags", but in a way that barely scratches the surface of
anyone's requirements. I, for one, need to put lots of videos into lots of
categories, and I find myself having to devise schemes that in effect
replace the very thing that pushed GOOG to the top of the internet hill,
which is information indexing and searching. Meanwhile, here I am going
through all sorts of gyrations and paying good money elsewhere in order for
me (the little nobody) to replicate what GOOG does so well with other
people's web sites' content BUT NOT WITH THEIR OWN.
What they could do is just that. They could provide a GAE platform with
which third-party developers could do to/with GOOG content (and even
services) what GOOG already does so well with other people's content, AND
THAT IS INDEXING AND SEARCHING OF CONTENT.
While all of GOOG's software engineering crews are doing fantastic things
on their own, somehow I have always felt that the pieces are not adding up
to the whole that they could be. That is, there needs to be a lot more
coordination and integration of everything that is there.
GOOG needs to do to its content and services what Wal*Mart did to retail
marketing, and that is better integration and packaging. Like them or not,
this is where and why Microsoft won so many software wars: their various
pieces ended up playing nicely with each other, while over on the Java side
everyone was doing their own thing, and for a long time, and even to this
day, plug-and-play remains elusive in too many areas.
However, now that I am digressing, I would like to end this tirade by
thanking you for sitting through my personal therapy session. ;)
On Thu, Apr 7, 2011 at 6:42 PM, johnny <[email protected]> wrote:
In my case, I have a bunch of constantly changing user-entered data that
quite simply needs to be translated into the locale of any connecting user.
To complicate the problem, many sections of the site are contextual.
Proxying things through Google really isn't a good solution, so I wrote a
simple servlet to convert the text on the fly using the Translate API:
    import com.google.api.translate.Language;
    import com.google.api.translate.Translate;

    // Translates a string with the google-api-translate-java client; on any
    // failure it falls back to returning the original, untranslated text.
    public static String translate(String string, String langStringfrom,
            String langStringto) {
        if (string.length() > 0) {
            try {
                Translate.setHttpReferrer("http://www.foo.com");
                String translatedText = Translate.execute(string,
                        Language.fromString(langStringfrom),
                        Language.fromString(langStringto));
                System.out.println(translatedText);
                return translatedText;
            } catch (Exception ex) {
                ex.printStackTrace();
                return string;  // fall back to the source text
            }
        } else {
            return string;
        }
    }
In testing, it was slowing down the site by about 1.5 seconds per page. Not
good, but considering all that was happening, it was tolerable. I moved it
to production and the load time quickly jumped to 10+ seconds per page.
After checking the logs, I now see that Google is throttling the
connections. The transactions with Google happen as a result of users
clicking, so I'm OK with the TOS there, but if I translate all the data
into every language ahead of time to speed things up, then besides having a
disastrously bloated DB, I've violated the TOS with Google. Unfortunately,
I've already found that the best solution is simply using this API. I just
need bandwidth and unthrottled connections. What does that cost?
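As a stopgap until Google clarifies things, one could wrap the helper above
with a short retry-and-backoff loop so that a throttled request gets
retried a couple of times instead of immediately adding seconds to the
page. This is only my own sketch, and the attempt count and delays are
guesses rather than anything Google documents:

    // Hypothetical wrapper around the translate() helper above; the attempt
    // count and delays are guesses, not documented Google limits.
    public static String translateWithBackoff(String text, String from, String to) {
        long delayMs = 250;  // initial pause before retrying
        for (int attempt = 0; attempt < 3; attempt++) {
            String result = translate(text, from, to);
            // translate() returns the original text when the API call fails,
            // so a changed result most likely means the call went through.
            if (!result.equals(text)) {
                return result;
            }
            try {
                Thread.sleep(delayMs);  // back off before the next attempt
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
            delayMs *= 2;  // exponential backoff
        }
        return text;  // give up and show the untranslated text
    }

It does not fix the throttling itself, of course; it just smooths over
transient rejections at the cost of some extra latency on the worst
requests.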
On Apr 7, 5:11 pm, Zdravko Gligic <[email protected]> wrote:
It would be extremely useful to get as much detail as possible about your
particular use case (even if, for privacy and/or competitive reasons, you
told us about a totally invented use case that is representative, TOS-wise,
of what you are doing) so that we can get a bit more insight into what on
earth GOOG is doing and whether it might be best for many of us to wait
until they get serious at their end. The only thing that should matter to
them is the use-case concept, and especially how GOOG services are used in
conjunction with a site's own revenue-generation model, etc. It is totally
mind-boggling that the success of a site should make any difference in the
overall allocation of how much one can consume. In fact, the whole thing
has a rotten smell to it: one of not being at all interested in good ideas
that are well executed, but only, perhaps, in identifying good ideas that
are poorly executed and therefore a lot easier to purloin. Now, having
written that previous sentence, even sensible me cannot believe that I
wrote it. However, I ended up leaving it in purely as an illustration of
how little sense this current scheme makes. To the fine people who work on
this at GOOG, I am sorry.
On Thu, Apr 7, 2011 at 3:47 PM, johnny <[email protected]> wrote:
I'm in the same boat. It worked like a dream in testing; I moved it to
production and my connections are getting throttled.
I've read that API license 5 times. The only way I can reduce traffic and
still provide the translation service is to violate the terms of service.
Google needs to address this problem, even if the solution is to offer us a
licensed instance of Google Translate for dedicated/unthrottled use.
On Apr 6, 1:53 pm, LAHatfield <[email protected]> wrote:
Our website is built up dynamically in JavaScript at runtime and involves a
number of elements which are not visible from the outset but are shown
later. I have tried adding Google's translate JavaScript to our page in the
hope it would just work, but it caused the whole page to fall over.
I have since started converting all text as I write it out to the page,
passing it through Google's Translate API. However, I am running into the
500-character limit, as well as timing issues, since I need to do some 800
translations, which takes time, and I quickly get terms-of-service error
messages. I have thought about caching the results in the application
object the first time a particular language is requested, but I am worried
that also breaks Google's rules.
Has anyone come up with a way around this? Is it possible to pay Google to
get unlimited access to the API? Or are there any other good free/paid-for
tools out there which I can plug into?
Thanks
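On the 500-character limit specifically, a rough sketch of a workaround,
using the same google-api-translate-java client shown in johnny's snippet
above and assuming a simple whitespace-based split is acceptable, would be
to break long text into sub-500-character chunks and translate each one:

    import java.util.ArrayList;
    import java.util.List;

    import com.google.api.translate.Language;
    import com.google.api.translate.Translate;

    public class ChunkedTranslator {

        private static final int MAX_CHARS = 500;  // the per-request limit mentioned above

        // Splits text into pieces no longer than MAX_CHARS, preferring to break at spaces.
        static List<String> chunk(String text) {
            List<String> chunks = new ArrayList<String>();
            int start = 0;
            while (start < text.length()) {
                int end = Math.min(start + MAX_CHARS, text.length());
                if (end < text.length()) {
                    int lastSpace = text.lastIndexOf(' ', end);
                    if (lastSpace > start) {
                        end = lastSpace;  // avoid cutting a word in half
                    }
                }
                chunks.add(text.substring(start, end).trim());
                start = end;
            }
            return chunks;
        }

        // Translates each chunk separately and rejoins the results with spaces.
        public static String translateLong(String text, String from, String to)
                throws Exception {
            Translate.setHttpReferrer("http://www.example.com");  // placeholder referrer
            StringBuilder out = new StringBuilder();
            for (String piece : chunk(text)) {
                out.append(Translate.execute(piece,
                        Language.fromString(from), Language.fromString(to)));
                out.append(' ');
            }
            return out.toString().trim();
        }
    }

Splitting on sentence boundaries instead of bare whitespace would read
better in the output, and this does nothing about the request-rate limits
discussed above; it only keeps each individual call under the length cap.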
--
You received this message because you are subscribed to the Google Groups "Google AJAX APIs" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/google-ajax-search-api?hl=en.