Inserting Datastore projected queries into PostgreSQL using Apache Beam
I am trying to copy Datastore entities to a PostgreSQL instance. Since I don't need every field, I use projections, following this snippet. I build the following query:
// makeFilter and makeValue are static imports from com.google.datastore.v1.client.DatastoreHelper
public static Query DatastoreQuery() {
    Query.Builder query = Query.newBuilder();
    // Query the "FOO" kind
    query.addKindBuilder().setName("FOO");
    // Add a filter on "bar"
    query.setFilter(makeFilter("bar", PropertyFilter.Operator.EQUAL, makeValue("fuz")));
    // Add a projection on "createdAt"
    query.addProjection(Projection.newBuilder().setProperty(PropertyReference.newBuilder().setName("createdAt")));
    return query.build();
}
This query is then used in the pipeline:
pipeline.apply(DatastoreIO.v1().read().withProjectId(options.getProjectId())
.withQuery(ExtractDatastore.DatastoreQuery()));
Following Unable to addProjection to relation field in envers query, I expect to get a Map<String, Object>. I would like to follow the Apache Beam documentation to insert the entities into PostgreSQL, using code similar to:
.apply(JdbcIO.<Map<String, Object>>write()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "com.google.cloud.sql.postgres",
            jdbcUrl)
        .withUsername(username)
        .withPassword(password))
    .withStatement("<INSERT QUERY>")
    .withPreparedStatementSetter(<this is what needs to be filled>));
As I have found no examples, my questions are the following:
1) How do I handle the elements returned by the query? In the case of my projection, how do I access createdAt, which is a timestamp? In Python, I did it with value.timestamp_value.ToDatetime(). Is there an equivalent in Java? (See the first sketch after these questions for what I would try.)
2) Are the entities really returned as Map<String, Object>, as described in the linked question?
3) Can JdbcIO.write() be applied to a PCollection<Map<String, Object>>, or does the input have to be in the KV<KeyType, ValueType> format used in the small snippet of the documentation? (The second sketch below shows the overall shape I have in mind.)
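For question 1), here is what I would try, assuming the elements come back as com.google.datastore.v1.Entity (which is exactly what question 2) is about) and assuming the projected property is still returned as a timestamp value. This is an untested sketch, not confirmed code:
import com.google.datastore.v1.Entity;
import com.google.protobuf.Timestamp;
import java.time.Instant;

// Untested: read the projected "createdAt" property and convert the
// protobuf Timestamp (seconds + nanos) into a java.sql.Timestamp.
static java.sql.Timestamp getCreatedAt(Entity entity) {
    Timestamp ts = entity.getPropertiesMap().get("createdAt").getTimestampValue();
    return java.sql.Timestamp.from(Instant.ofEpochSecond(ts.getSeconds(), ts.getNanos()));
}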
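And for question 3), this is the overall shape I would expect if the elements are Entity objects rather than Map<String, Object>. Again an untested sketch: the table name foo, the created_at column and the INSERT statement are made up, and the driver string is the one from my snippet above:
import com.google.datastore.v1.Entity;
import com.google.protobuf.Timestamp;
import java.sql.PreparedStatement;
import java.time.Instant;
import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
import org.apache.beam.sdk.io.jdbc.JdbcIO;

pipeline
    .apply(DatastoreIO.v1().read()
        .withProjectId(options.getProjectId())
        .withQuery(ExtractDatastore.DatastoreQuery()))
    .apply(JdbcIO.<Entity>write()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
                "com.google.cloud.sql.postgres", jdbcUrl)
            .withUsername(username)
            .withPassword(password))
        // Hypothetical target table: foo(created_at timestamp)
        .withStatement("INSERT INTO foo (created_at) VALUES (?)")
        .withPreparedStatementSetter((Entity entity, PreparedStatement stmt) -> {
            Timestamp ts = entity.getPropertiesMap().get("createdAt").getTimestampValue();
            stmt.setTimestamp(1, java.sql.Timestamp.from(
                Instant.ofEpochSecond(ts.getSeconds(), ts.getNanos())));
        }));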
I would be grateful for any input on this topic.
java google-cloud-platform google-cloud-dataflow apache-beam
asked Nov 12 at 16:20 by Dr Mouse