[google-appengine] Re: mounting buckets or accessing gcp storage from AI platform notebook instance

2019-09-23 Thread 'Ali T (Cloud Platform Support)' via Google App Engine
Hi,

One way to do this is through gcsfuse. Once you create an AI Platform 
notebook, click on the notebook's name, SSH into the VM, and follow the 
gcsfuse installation instructions for Ubuntu and Debian.

Once gcsfuse is installed, run the commands below to mount your bucket into 
the Jupyter notebook:

$ cd /home/jupyter/
$ sudo su jupyter
$ mkdir MOUNT_DIRECTORY
$ /usr/bin/gcsfuse GCS_BUCKET MOUNT_DIRECTORY

Afterwards, in your Jupyter notebook interface, you will find a 
MOUNT_DIRECTORY folder which is synced with your GCS bucket.
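Once mounted, the bucket's objects behave like ordinary files, so plain filesystem APIs work on them from notebook code. A minimal sketch (the mount path `/home/jupyter/MOUNT_DIRECTORY` is the placeholder from the commands above, not a fixed location):

```python
from pathlib import Path

def list_mounted_objects(mount_dir):
    """Return the names of objects visible under a gcsfuse mount point.

    Because gcsfuse exposes the bucket as a directory, standard
    filesystem calls (iterdir, open, etc.) work on its objects.
    """
    return sorted(p.name for p in Path(mount_dir).iterdir() if p.is_file())

# From the notebook, you would point this at the mount, e.g.:
# list_mounted_objects("/home/jupyter/MOUNT_DIRECTORY")
```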

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to google-appengine+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/google-appengine/2492987c-ea29-405d-a50a-f26e80ec14bb%40googlegroups.com.


[google-appengine] Re: Google Speech-to-Text

2019-07-30 Thread 'Ali T (Cloud Platform Support)' via Google App Engine
Hi,

The model should be chosen according to where the audio being transcribed 
originates. If the audio does not match one of the alternative models, 
choosing the default model is appropriate. You can find a breakdown of each 
model in the request configuration documentation. If you do want to use 
speaker diarization, note that it's only available for the phone_call model.

Regarding audio length, whether you choose the default, video, or phone_call 
model, you shouldn't have any problems transcribing a 3-hour-long audio 
file. For audio length, the content limits depend on the request type rather 
than the model chosen.
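As a hedged sketch of how these choices fit together, the snippet below builds the JSON body for a long-running recognition request with the phone_call model and diarization enabled. The GCS URI, encoding, language, and speaker count are placeholders for your own values, and audio this long must be referenced by a GCS URI rather than inlined in the request:

```python
def build_recognize_request(gcs_uri, speaker_count=2):
    """Build a JSON body for an asynchronous Speech-to-Text request.

    Speaker diarization is only supported with the phone_call model,
    so that model is set explicitly alongside the diarization fields.
    """
    return {
        "config": {
            "encoding": "LINEAR16",
            "languageCode": "en-US",
            "model": "phone_call",
            "enableSpeakerDiarization": True,
            "diarizationSpeakerCount": speaker_count,
        },
        "audio": {"uri": gcs_uri},
    }
```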



[google-appengine] Re: Is there any way of triggering DataPrep based flow from AppEngine?

2019-07-23 Thread 'Ali T (Cloud Platform Support)' via Google App Engine
Hi,

When you launch a Dataprep job, it actually creates a Dataflow template. 
That being said, once the Dataprep job has been launched at least once, you 
can run it as a Dataflow template. Thus, from your app, when the button is 
clicked, you could launch the job using the Dataflow templates REST API.

The template created from the Dataprep job will be located in GCS. To 
locate it, click on the Jobs tab in the Dataprep UI. When hovering over a 
historical job, you will see a "View results" button on the far right. On 
that page, on the right-hand side, there is a "Job Summary" section which 
indicates the GCS location of the created Dataflow template.

Lastly, before doing so, I would suggest going over the known limitations 
of running a Dataflow job from a Dataprep-created template.
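To make the REST call concrete, here is a minimal sketch of the URL and body for a Dataflow `templates.launch` request. The project ID, job name, and template path are placeholders; the template path is the GCS location shown in Dataprep's "Job Summary" panel:

```python
def build_template_launch_request(project, gcs_template_path, job_name,
                                  parameters=None):
    """Build the URL and JSON body for a Dataflow templates.launch call.

    The gcsPath query parameter points at the template the Dataprep
    job created; the body names the job and passes any template
    parameters.
    """
    url = (f"https://dataflow.googleapis.com/v1b3/projects/"
           f"{project}/templates:launch?gcsPath={gcs_template_path}")
    body = {"jobName": job_name, "parameters": parameters or {}}
    return url, body
```

Your button handler would POST this body to the URL with an OAuth access token.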



Re: [google-appengine] Re: How to export BigQuery data to a Google Storage bucket automatically, using either the API or the Java SDK?

2018-10-01 Thread 'Ali T (Cloud Platform Support)' via Google App Engine
There are currently no predefined templates from BigQuery to Google Cloud 
Storage. However, Dataflow does allow you to create your own templates to 
fit your use case. You can find information pertaining to creating your own 
template at the following documentation[1]. Information regarding the 
required classes and functions to read from BigQuery and write to Google 
Cloud Storage for Java are available on the Apache Beam API reference 
page[2]. Additionally, as your use case is to automate the export, you can 
do so using App Engine Cron services[3].

[1] https://cloud.google.com/dataflow/docs/templates/creating-templates
[2] https://beam.apache.org/documentation/sdks/javadoc/2.6.0/ 
[3] 
https://cloud.google.com/blog/products/gcp/scheduling-dataflow-pipelines-using-app-engine-cron-service-or-cloud-functions
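For the scheduling part, a minimal App Engine `cron.yaml` sketch is shown below; the handler URL `/tasks/run-export` and the schedule are placeholder values for whatever endpoint in your app triggers the Dataflow template launch:

```yaml
cron:
- description: "nightly BigQuery -> Cloud Storage export via Dataflow template"
  url: /tasks/run-export
  schedule: every 24 hours
```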



[google-appengine] Re: How to export BigQuery data to a Google Storage bucket automatically, using either the API or the Java SDK?

2018-09-28 Thread 'Ali T (Cloud Platform Support)' via Google App Engine


It’s possible to do so. This would be like any other application that does 
some manipulation on data from BigQuery. Your application can use the 
client libraries[1] or the REST API with the extract property[2]. Once the 
application is set up, you can use cron jobs[3] to schedule it to run every 
“n” time interval.

Moreover, as outlined in the official documentation on exporting data from 
BigQuery[4], this can also be done through Dataflow jobs, which can likewise 
be scheduled to run automatically at a certain time interval.

[1] https://cloud.google.com/bigquery/docs/exporting-data#exporting_table_data
[2] https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/insert
[3] https://cloud.google.com/appengine/docs/standard/python/config/cron
[4] https://cloud.google.com/bigquery/docs/exporting-data
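As a hedged illustration of the REST approach in [2], the snippet below builds the `jobs.insert` body for an extract job that exports a table to Cloud Storage. The project, dataset, table, and destination URI are placeholders:

```python
def build_extract_job(project, dataset, table, gcs_uri):
    """Build the jobs.insert body for a BigQuery extract job.

    The extract configuration names the source table and one or more
    Cloud Storage destination URIs; a wildcard in the URI lets
    BigQuery shard large exports across multiple files.
    """
    return {
        "configuration": {
            "extract": {
                "sourceTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "destinationUris": [gcs_uri],
                "destinationFormat": "CSV",
            }
        }
    }
```

A cron-triggered handler would POST this body to the `jobs.insert` endpoint on each run.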



[google-appengine] Re: help with finding something that was in my virgin mobile phone cloud . m

2018-07-30 Thread 'Ali T (Cloud Platform Support)' via Google App Engine


Hi Adam,

This issue is not related to Google Cloud Platform. I would suggest you 
contact your phone company or post this question on Quora or Yahoo Answers.

On Monday, July 30, 2018 at 9:01:43 AM UTC-4, Adam Weaver wrote:
>
> my phone broke and i desperately need information that was on my cloud but 
> I dont know how to find it  can some one help me pls
>
>
>
>
