Jul 23, 2019, 05:38 PM
Done. PASS=4 ERROR=0 SKIP=0 TOTAL=4
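The `Done.` summary line above uses a fixed `KEY=COUNT` layout (`PASS`, `ERROR`, `SKIP`, `TOTAL`), so a script can check a run's outcome mechanically. A minimal sketch; `parse_run_summary` is a hypothetical helper, not part of dbt:

```python
import re

def parse_run_summary(line):
    """Pull the KEY=COUNT pairs out of a dbt 'Done.' summary line."""
    return {key: int(count)
            for key, count in re.findall(r"(PASS|ERROR|SKIP|TOTAL)=(\d+)", line)}

summary = parse_run_summary("Done. PASS=4 ERROR=0 SKIP=0 TOTAL=4")
# summary == {'PASS': 4, 'ERROR': 0, 'SKIP': 0, 'TOTAL': 4}
```

A CI wrapper could, for example, fail the build whenever `summary['ERROR'] > 0`.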
2019-07-11 12:32:56,662 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000020276EF81D0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000020276EDE390>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000020276EDE3C8>]}
2019-07-11 12:32:56,999 (MainThread): Flushing usage events
2019-07-11 14:19:35,203 (MainThread): Tracking: tracking
2019-07-11 14:19:35,206 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000028CC648ACC0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000028CC648A9B0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x0000028CC648AD68>]}
2019-07-11 14:19:35,616 (MainThread): Parsing macros\optimizers\index.sql
2019-07-11 14:19:35,622 (MainThread): Parsing macros\core.sql
2019-07-11 14:19:35,627 (MainThread): Parsing macros\adapters\common.sql
2019-07-11 14:19:35,656 (MainThread): Parsing macros\etc\datetime.sql
2019-07-11 14:19:35,664 (MainThread): Parsing macros\etc\get_custom_alias.sql
2019-07-11 14:19:35,665 (MainThread): Parsing macros\etc\get_custom_schema.sql
2019-07-11 14:19:35,667 (MainThread): Parsing macros\etc\get_relation_comment.sql
2019-07-11 14:19:35,669 (MainThread): Parsing macros\etc\is_incremental.sql
2019-07-11 14:19:35,671 (MainThread): Parsing macros\etc\query.sql
2019-07-11 14:19:35,672 (MainThread): Parsing macros\materializations\helpers.sql
2019-07-11 14:19:35,679 (MainThread): Parsing macros\materializations\common\merge.sql
2019-07-11 14:19:35,685 (MainThread): Parsing macros\materializations\incremental\incremental.sql
2019-07-11 14:19:35,692 (MainThread): Parsing macros\materializations\seed\seed.sql
2019-07-11 14:19:35,706 (MainThread): Parsing macros\materializations\snapshot\snapshot.sql
2019-07-11 14:19:35,725 (MainThread): Parsing macros\materializations\snapshot\snapshot_merge.sql
2019-07-11 14:19:35,727 (MainThread): Parsing macros\materializations\snapshot\strategies.sql
2019-07-11 14:19:35,737 (MainThread): Parsing macros\materializations\table\table.sql
2019-07-11 14:19:35,742 (MainThread): Parsing macros\materializations\view\create_or_replace_view.sql
2019-07-11 14:19:35,746 (MainThread): Parsing macros\materializations\view\view.sql
2019-07-11 14:19:35,752 (MainThread): Parsing macros\schema_tests\accepted_values.sql
2019-07-11 14:19:35,753 (MainThread): Parsing macros\schema_tests\not_null.sql
2019-07-11 14:19:35,755 (MainThread): Parsing macros\schema_tests\relationships.sql
2019-07-11 14:19:35,756 (MainThread): Parsing macros\schema_tests\unique.sql
2019-07-11 14:19:35,758 (MainThread): Parsing macros\adapters.sql
2019-07-11 14:19:35,767 (MainThread): Parsing macros\catalog.sql
2019-07-11 14:19:35,769 (MainThread): Parsing macros\relations.sql
2019-07-11 14:19:35,771 (MainThread): Parsing macros\materializations\snapshot_merge.sql
2019-07-11 14:19:35,788 (MainThread): Parsing model.analytics_augmentation_package.anonymous_user_mappings
2019-07-11 14:19:35,789 (MainThread): Acquiring new postgres connection "anonymous_user_mappings".
2019-07-11 14:19:35,789 (MainThread): Opening a new connection, currently in state init
2019-07-11 14:19:36,211 (MainThread): Parsing model.analytics_augmentation_package.earliest_user_events
2019-07-11 14:19:36,211 (MainThread): Acquiring new postgres connection "earliest_user_events".
2019-07-11 14:19:36,212 (MainThread): Re-using an available connection from the pool (formerly anonymous_user_mappings).
2019-07-11 14:19:36,218 (MainThread): Parsing model.analytics_augmentation_package.real_time_utm
2019-07-11 14:19:36,219 (MainThread): Acquiring new postgres connection "real_time_utm".
2019-07-11 14:19:36,219 (MainThread): Re-using an available connection from the pool (formerly earliest_user_events).
2019-07-11 14:19:36,225 (MainThread): Parsing model.analytics_augmentation_package.unified_user_id
2019-07-11 14:19:36,226 (MainThread): Acquiring new postgres connection "unified_user_id".
2019-07-11 14:19:36,226 (MainThread): Re-using an available connection from the pool (formerly real_time_utm).
2019-07-11 14:19:36,268 (MainThread): Found 4 models, 0 tests, 0 snapshots, 0 analyses, 117 macros, 0 operations, 0 seed files, 0 sources
2019-07-11 14:19:36,269 (MainThread):
2019-07-11 14:19:36,270 (MainThread): Acquiring new postgres connection "master".
2019-07-11 14:19:36,270 (MainThread): Re-using an available connection from the pool (formerly unified_user_id).
2019-07-11 14:19:36,285 (MainThread): Parsing macros\core.sql
2019-07-11 14:19:36,290 (MainThread): Parsing macros\adapters\common.sql
2019-07-11 14:19:36,319 (MainThread): Parsing macros\etc\datetime.sql
2019-07-11 14:19:36,326 (MainThread): Parsing macros\etc\get_custom_alias.sql
2019-07-11 14:19:36,327 (MainThread): Parsing macros\etc\get_custom_schema.sql
2019-07-11 14:19:36,329 (MainThread): Parsing macros\etc\get_relation_comment.sql
2019-07-11 14:19:36,332 (MainThread): Parsing macros\etc\is_incremental.sql
2019-07-11 14:19:36,333 (MainThread): Parsing macros\etc\query.sql
2019-07-11 14:19:36,335 (MainThread): Parsing macros\materializations\helpers.sql
2019-07-11 14:19:36,356 (MainThread): Parsing macros\materializations\common\merge.sql
2019-07-11 14:19:36,362 (MainThread): Parsing macros\materializations\incremental\incremental.sql
2019-07-11 14:19:36,368 (MainThread): Parsing macros\materializations\seed\seed.sql
2019-07-11 14:19:36,380 (MainThread): Parsing macros\materializations\snapshot\snapshot.sql
2019-07-11 14:19:36,396 (MainThread): Parsing macros\materializations\snapshot\snapshot_merge.sql
2019-07-11 14:19:36,398 (MainThread): Parsing macros\materializations\snapshot\strategies.sql
2019-07-11 14:19:36,408 (MainThread): Parsing macros\materializations\table\table.sql
2019-07-11 14:19:36,413 (MainThread): Parsing macros\materializations\view\create_or_replace_view.sql
2019-07-11 14:19:36,419 (MainThread): Parsing macros\materializations\view\view.sql
2019-07-11 14:19:36,424 (MainThread): Parsing macros\schema_tests\accepted_values.sql
2019-07-11 14:19:36,426 (MainThread): Parsing macros\schema_tests\not_null.sql
2019-07-11 14:19:36,427 (MainThread): Parsing macros\schema_tests\relationships.sql
2019-07-11 14:19:36,428 (MainThread): Parsing macros\schema_tests\unique.sql
2019-07-11 14:19:36,430 (MainThread): Parsing macros\adapters.sql
2019-07-11 14:19:36,439 (MainThread): Parsing macros\catalog.sql
2019-07-11 14:19:36,441 (MainThread): Parsing macros\relations.sql
2019-07-11 14:19:36,443 (MainThread): Parsing macros\materializations\snapshot_merge.sql
2019-07-11 14:19:36,542 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,542 (MainThread): On master:
    select distinct nspname from pg_namespace

2019-07-11 14:19:36,569 (MainThread): SQL status: SELECT 32 in 0.03 seconds
2019-07-11 14:19:36,580 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,580 (MainThread): On master: BEGIN
2019-07-11 14:19:36,593 (MainThread): SQL status: BEGIN in 0.01 seconds
2019-07-11 14:19:36,593 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,593 (MainThread): On master: select
        'analyticsd' as database,
        tablename as name,
        schemaname as schema,
        'table' as type
    from pg_tables
    where schemaname ilike 'public'
    union all
    select
        'analyticsd' as database,
        viewname as name,
        schemaname as schema,
        'view' as type
    from pg_views
    where schemaname ilike 'public'

2019-07-11 14:19:36,607 (MainThread): SQL status: SELECT 10 in 0.01 seconds
2019-07-11 14:19:36,623 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,623 (MainThread): On master: --
    --
    with relation as (
        select
            pg_rewrite.ev_class as class,
            pg_rewrite.oid as id
        from pg_rewrite
    ),
    class as (
        select
            oid as id,
            relname as name,
            relnamespace as schema,
            relkind as kind
        from pg_class
    ),
    dependency as (
        select
            pg_depend.objid as id,
            pg_depend.refobjid as ref
        from pg_depend
    ),
    schema as (
        select
            pg_namespace.oid as id,
            pg_namespace.nspname as name
        from pg_namespace
        where nspname != 'information_schema' and nspname not like 'pg_%'
    ),
    referenced as (
        select
            relation.id as id,
            referenced_class.name,
            referenced_class.schema,
            referenced_class.kind
        from relation
        join class as referenced_class on relation.class = referenced_class.id
        where referenced_class.kind in ('r', 'v')
    ),
    relationships as (
        select
            referenced.name as referenced_name,
            referenced.schema as referenced_schema_id,
            dependent_class.name as dependent_name,
            dependent_class.schema as dependent_schema_id,
            referenced.kind as kind
        from referenced
        join dependency on referenced.id = dependency.id
        join class as dependent_class on dependency.ref = dependent_class.id
        where
            (referenced.name != dependent_class.name or
             referenced.schema != dependent_class.schema)
    )

    select
        referenced_schema.name as referenced_schema,
        relationships.referenced_name as referenced_name,
        dependent_schema.name as dependent_schema,
        relationships.dependent_name as dependent_name
    from relationships
    join schema as dependent_schema on relationships.dependent_schema_id = dependent_schema.id
    join schema as referenced_schema on relationships.referenced_schema_id = referenced_schema.id
    group by referenced_schema, referenced_name, dependent_schema, dependent_name
    order by referenced_schema, referenced_name, dependent_schema, dependent_name;
2019-07-11 14:19:36,642 (MainThread): SQL status: SELECT 0 in 0.02 seconds
2019-07-11 14:19:36,642 (MainThread): On master: ROLLBACK
2019-07-11 14:19:36,656 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,656 (MainThread): On master: BEGIN
2019-07-11 14:19:36,684 (MainThread): SQL status: BEGIN in 0.03 seconds
2019-07-11 14:19:36,684 (MainThread): On master: COMMIT
2019-07-11 14:19:36,684 (MainThread): Using postgres connection "master".
2019-07-11 14:19:36,684 (MainThread): On master: COMMIT
2019-07-11 14:19:36,700 (MainThread): SQL status: COMMIT in 0.02 seconds
2019-07-11 14:19:36,700 (MainThread): 14:19:36 | Concurrency: 5 threads (target='dev')
2019-07-11 14:19:36,701 (MainThread): 14:19:36 |
2019-07-11 14:19:36,709 (Thread-1): 14:19:36 | 1 of 4 START incremental model public.anonymous_user_mappings........ [RUN]
2019-07-11 14:19:36,711 (Thread-1): Acquiring new postgres connection "anonymous_user_mappings".
2019-07-11 14:19:36,709 (Thread-2): 14:19:36 | 2 of 4 START incremental model public.earliest_user_events........... [RUN]
2019-07-11 14:19:36,711 (Thread-1): Opening a new connection, currently in state init
2019-07-11 14:19:36,712 (Thread-2): Acquiring new postgres connection "earliest_user_events".
2019-07-11 14:19:36,714 (Thread-2): Opening a new connection, currently in state init
2019-07-11 14:19:36,710 (Thread-3): 14:19:36 | 3 of 4 START incremental model public.real_time_utm.................. [RUN]
2019-07-11 14:19:36,716 (Thread-3): Acquiring new postgres connection "real_time_utm".
2019-07-11 14:19:36,710 (Thread-4): 14:19:36 | 4 of 4 START incremental model public.unified_user_id................ [RUN]
2019-07-11 14:19:36,716 (Thread-3): Opening a new connection, currently in state init
2019-07-11 14:19:36,717 (Thread-4): Acquiring new postgres connection "unified_user_id".
2019-07-11 14:19:36,718 (Thread-4): Opening a new connection, currently in state init
2019-07-11 14:19:36,801 (Thread-1): Compiling model.analytics_augmentation_package.anonymous_user_mappings
2019-07-11 14:19:36,808 (Thread-3): Compiling model.analytics_augmentation_package.real_time_utm
2019-07-11 14:19:36,809 (Thread-1): Writing injected SQL for node "model.analytics_augmentation_package.anonymous_user_mappings"
2019-07-11 14:19:36,821 (Thread-4): Compiling model.analytics_augmentation_package.unified_user_id
2019-07-11 14:19:36,827 (Thread-3): Writing injected SQL for node "model.analytics_augmentation_package.real_time_utm"
2019-07-11 14:19:36,832 (Thread-4): Writing injected SQL for node "model.analytics_augmentation_package.unified_user_id"
2019-07-11 14:19:45,719 (MainThread): Tracking: tracking
2019-07-11 14:19:45,723 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D10A29E8>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D0418198>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D10B0198>]}
2019-07-11 14:19:46,122 (MainThread): Parsing macros\optimizers\index.sql
2019-07-11 14:19:46,128 (MainThread): Parsing macros\core.sql
2019-07-11 14:19:46,134 (MainThread): Parsing macros\adapters\common.sql
2019-07-11 14:19:46,176 (MainThread): Parsing macros\etc\datetime.sql
2019-07-11 14:19:46,186 (MainThread): Parsing macros\etc\get_custom_alias.sql
2019-07-11 14:19:46,187 (MainThread): Parsing macros\etc\get_custom_schema.sql
2019-07-11 14:19:46,191 (MainThread): Parsing macros\etc\get_relation_comment.sql
2019-07-11 14:19:46,196 (MainThread): Parsing macros\etc\is_incremental.sql
2019-07-11 14:19:46,198 (MainThread): Parsing macros\etc\query.sql
2019-07-11 14:19:46,200 (MainThread): Parsing macros\materializations\helpers.sql
2019-07-11 14:19:46,208 (MainThread): Parsing macros\materializations\common\merge.sql
2019-07-11 14:19:46,216 (MainThread): Parsing macros\materializations\incremental\incremental.sql
2019-07-11 14:19:46,224 (MainThread): Parsing macros\materializations\seed\seed.sql
2019-07-11 14:19:46,241 (MainThread): Parsing macros\materializations\snapshot\snapshot.sql
2019-07-11 14:19:46,263 (MainThread): Parsing macros\materializations\snapshot\snapshot_merge.sql
2019-07-11 14:19:46,265 (MainThread): Parsing macros\materializations\snapshot\strategies.sql
2019-07-11 14:19:46,276 (MainThread): Parsing macros\materializations\table\table.sql
2019-07-11 14:19:46,281 (MainThread): Parsing macros\materializations\view\create_or_replace_view.sql
2019-07-11 14:19:46,285 (MainThread): Parsing macros\materializations\view\view.sql
2019-07-11 14:19:46,291 (MainThread): Parsing macros\schema_tests\accepted_values.sql
2019-07-11 14:19:46,293 (MainThread): Parsing macros\schema_tests\not_null.sql
2019-07-11 14:19:46,295 (MainThread): Parsing macros\schema_tests\relationships.sql
2019-07-11 14:19:46,296 (MainThread): Parsing macros\schema_tests\unique.sql
2019-07-11 14:19:46,298 (MainThread): Parsing macros\adapters.sql
2019-07-11 14:19:46,308 (MainThread): Parsing macros\catalog.sql
2019-07-11 14:19:46,311 (MainThread): Parsing macros\relations.sql
2019-07-11 14:19:46,314 (MainThread): Parsing macros\materializations\snapshot_merge.sql
2019-07-11 14:19:46,333 (MainThread): Parsing model.analytics_augmentation_package.anonymous_user_mappings
2019-07-11 14:19:46,334 (MainThread): Acquiring new postgres connection "anonymous_user_mappings".
2019-07-11 14:19:46,334 (MainThread): Opening a new connection, currently in state init
2019-07-11 14:19:46,707 (MainThread): Parsing model.analytics_augmentation_package.earliest_user_events
2019-07-11 14:19:46,707 (MainThread): Acquiring new postgres connection "earliest_user_events".
2019-07-11 14:19:46,707 (MainThread): Re-using an available connection from the pool (formerly anonymous_user_mappings).
2019-07-11 14:19:46,714 (MainThread): Parsing model.analytics_augmentation_package.real_time_utm
2019-07-11 14:19:46,715 (MainThread): Acquiring new postgres connection "real_time_utm".
2019-07-11 14:19:46,715 (MainThread): Re-using an available connection from the pool (formerly earliest_user_events).
2019-07-11 14:19:46,719 (MainThread): Parsing model.analytics_augmentation_package.unified_user_id
2019-07-11 14:19:46,720 (MainThread): Acquiring new postgres connection "unified_user_id".
2019-07-11 14:19:46,720 (MainThread): Re-using an available connection from the pool (formerly real_time_utm).
2019-07-11 14:19:46,753 (MainThread): Found 4 models, 0 tests, 0 snapshots, 0 analyses, 117 macros, 0 operations, 0 seed files, 0 sources
2019-07-11 14:19:46,754 (MainThread):
2019-07-11 14:19:46,756 (MainThread): Acquiring new postgres connection "master".
2019-07-11 14:19:46,756 (MainThread): Re-using an available connection from the pool (formerly unified_user_id).
2019-07-11 14:19:46,773 (MainThread): Parsing macros\core.sql
2019-07-11 14:19:46,777 (MainThread): Parsing macros\adapters\common.sql
2019-07-11 14:19:46,806 (MainThread): Parsing macros\etc\datetime.sql
2019-07-11 14:19:46,813 (MainThread): Parsing macros\etc\get_custom_alias.sql
2019-07-11 14:19:46,814 (MainThread): Parsing macros\etc\get_custom_schema.sql
2019-07-11 14:19:46,815 (MainThread): Parsing macros\etc\get_relation_comment.sql
2019-07-11 14:19:46,817 (MainThread): Parsing macros\etc\is_incremental.sql
2019-07-11 14:19:46,819 (MainThread): Parsing macros\etc\query.sql
2019-07-11 14:19:46,820 (MainThread): Parsing macros\materializations\helpers.sql
2019-07-11 14:19:46,842 (MainThread): Parsing macros\materializations\common\merge.sql
2019-07-11 14:19:46,851 (MainThread): Parsing macros\materializations\incremental\incremental.sql
2019-07-11 14:19:46,858 (MainThread): Parsing macros\materializations\seed\seed.sql
2019-07-11 14:19:46,871 (MainThread): Parsing macros\materializations\snapshot\snapshot.sql
2019-07-11 14:19:46,887 (MainThread): Parsing macros\materializations\snapshot\snapshot_merge.sql
2019-07-11 14:19:46,889 (MainThread): Parsing macros\materializations\snapshot\strategies.sql
2019-07-11 14:19:46,899 (MainThread): Parsing macros\materializations\table\table.sql
2019-07-11 14:19:46,904 (MainThread): Parsing macros\materializations\view\create_or_replace_view.sql
2019-07-11 14:19:46,908 (MainThread): Parsing macros\materializations\view\view.sql
2019-07-11 14:19:46,913 (MainThread): Parsing macros\schema_tests\accepted_values.sql
2019-07-11 14:19:46,915 (MainThread): Parsing macros\schema_tests\not_null.sql
2019-07-11 14:19:46,916 (MainThread): Parsing macros\schema_tests\relationships.sql
2019-07-11 14:19:46,918 (MainThread): Parsing macros\schema_tests\unique.sql
2019-07-11 14:19:46,920 (MainThread): Parsing macros\adapters.sql
2019-07-11 14:19:46,929 (MainThread): Parsing macros\catalog.sql
2019-07-11 14:19:46,931 (MainThread): Parsing macros\relations.sql
2019-07-11 14:19:46,933 (MainThread): Parsing macros\materializations\snapshot_merge.sql
2019-07-11 14:19:47,035 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,036 (MainThread): On master:
    select distinct nspname from pg_namespace

2019-07-11 14:19:47,065 (MainThread): SQL status: SELECT 32 in 0.03 seconds
2019-07-11 14:19:47,077 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,077 (MainThread): On master: BEGIN
2019-07-11 14:19:47,089 (MainThread): SQL status: BEGIN in 0.01 seconds
2019-07-11 14:19:47,096 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,096 (MainThread): On master: select
        'analyticsd' as database,
        tablename as name,
        schemaname as schema,
        'table' as type
    from pg_tables
    where schemaname ilike 'public'
    union all
    select
        'analyticsd' as database,
        viewname as name,
        schemaname as schema,
        'view' as type
    from pg_views
    where schemaname ilike 'public'

2019-07-11 14:19:47,112 (MainThread): SQL status: SELECT 10 in 0.02 seconds
2019-07-11 14:19:47,126 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,126 (MainThread): On master: --
    --
    with relation as (
        select
            pg_rewrite.ev_class as class,
            pg_rewrite.oid as id
        from pg_rewrite
    ),
    class as (
        select
            oid as id,
            relname as name,
            relnamespace as schema,
            relkind as kind
        from pg_class
    ),
    dependency as (
        select
            pg_depend.objid as id,
            pg_depend.refobjid as ref
        from pg_depend
    ),
    schema as (
        select
            pg_namespace.oid as id,
            pg_namespace.nspname as name
        from pg_namespace
        where nspname != 'information_schema' and nspname not like 'pg_%'
    ),
    referenced as (
        select
            relation.id as id,
            referenced_class.name,
            referenced_class.schema,
            referenced_class.kind
        from relation
        join class as referenced_class on relation.class = referenced_class.id
        where referenced_class.kind in ('r', 'v')
    ),
    relationships as (
        select
            referenced.name as referenced_name,
            referenced.schema as referenced_schema_id,
            dependent_class.name as dependent_name,
            dependent_class.schema as dependent_schema_id,
            referenced.kind as kind
        from referenced
        join dependency on referenced.id = dependency.id
        join class as dependent_class on dependency.ref = dependent_class.id
        where
            (referenced.name != dependent_class.name or
             referenced.schema != dependent_class.schema)
    )

    select
        referenced_schema.name as referenced_schema,
        relationships.referenced_name as referenced_name,
        dependent_schema.name as dependent_schema,
        relationships.dependent_name as dependent_name
    from relationships
    join schema as dependent_schema on relationships.dependent_schema_id = dependent_schema.id
    join schema as referenced_schema on relationships.referenced_schema_id = referenced_schema.id
    group by referenced_schema, referenced_name, dependent_schema, dependent_name
    order by referenced_schema, referenced_name, dependent_schema, dependent_name;
2019-07-11 14:19:47,145 (MainThread): SQL status: SELECT 0 in 0.02 seconds
2019-07-11 14:19:47,146 (MainThread): On master: ROLLBACK
2019-07-11 14:19:47,159 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,159 (MainThread): On master: BEGIN
2019-07-11 14:19:47,188 (MainThread): SQL status: BEGIN in 0.03 seconds
2019-07-11 14:19:47,189 (MainThread): On master: COMMIT
2019-07-11 14:19:47,189 (MainThread): Using postgres connection "master".
2019-07-11 14:19:47,189 (MainThread): On master: COMMIT
2019-07-11 14:19:47,201 (MainThread): SQL status: COMMIT in 0.01 seconds
2019-07-11 14:19:47,201 (MainThread): 14:19:47 | Concurrency: 5 threads (target='dev')
2019-07-11 14:19:47,201 (MainThread): 14:19:47 |
2019-07-11 14:19:47,206 (Thread-1): 14:19:47 | 1 of 4 START incremental model public.anonymous_user_mappings........ [RUN]
2019-07-11 14:19:47,207 (Thread-1): Acquiring new postgres connection "anonymous_user_mappings".
2019-07-11 14:19:47,207 (Thread-1): Opening a new connection, currently in state init
2019-07-11 14:19:47,206 (Thread-2): 14:19:47 | 2 of 4 START incremental model public.earliest_user_events........... [RUN]
2019-07-11 14:19:47,209 (Thread-2): Acquiring new postgres connection "earliest_user_events".
2019-07-11 14:19:47,208 (Thread-3): 14:19:47 | 3 of 4 START incremental model public.real_time_utm.................. [RUN]
2019-07-11 14:19:47,209 (Thread-2): Opening a new connection, currently in state init
2019-07-11 14:19:47,209 (Thread-3): Acquiring new postgres connection "real_time_utm".
2019-07-11 14:19:47,208 (Thread-4): 14:19:47 | 4 of 4 START incremental model public.unified_user_id................ [RUN]
2019-07-11 14:19:47,210 (Thread-3): Opening a new connection, currently in state init
2019-07-11 14:19:47,210 (Thread-4): Acquiring new postgres connection "unified_user_id".
2019-07-11 14:19:47,211 (Thread-4): Opening a new connection, currently in state init
2019-07-11 14:19:47,293 (Thread-1): Compiling model.analytics_augmentation_package.anonymous_user_mappings
2019-07-11 14:19:47,299 (Thread-1): Writing injected SQL for node "model.analytics_augmentation_package.anonymous_user_mappings"
2019-07-11 14:19:47,300 (Thread-4): Compiling model.analytics_augmentation_package.unified_user_id
2019-07-11 14:19:47,306 (Thread-4): Writing injected SQL for node "model.analytics_augmentation_package.unified_user_id"
2019-07-11 14:19:47,307 (Thread-2): Compiling model.analytics_augmentation_package.earliest_user_events
2019-07-11 14:19:47,314 (Thread-2): Writing injected SQL for node "model.analytics_augmentation_package.earliest_user_events"
2019-07-11 14:19:47,314 (Thread-3): Compiling model.analytics_augmentation_package.real_time_utm
2019-07-11 14:19:47,337 (Thread-3): Writing injected SQL for node "model.analytics_augmentation_package.real_time_utm"
2019-07-11 14:19:47,405 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:47,405 (Thread-3): On real_time_utm: BEGIN
2019-07-11 14:19:47,410 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,410 (Thread-2): On earliest_user_events: BEGIN
2019-07-11 14:19:47,425 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,425 (Thread-4): On unified_user_id: BEGIN
2019-07-11 14:19:47,429 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:47,430 (Thread-1): On anonymous_user_mappings: BEGIN
2019-07-11 14:19:47,432 (Thread-3): SQL status: BEGIN in 0.03 seconds
2019-07-11 14:19:47,432 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:47,432 (Thread-3): On real_time_utm: create temporary table
    "real_time_utm__dbt_tmp20190711141947380821"
    as (
    -- Welcome to your first dbt model!
    -- Did you know that you can also configure models directly within
    -- the SQL file? This will override configurations stated in dbt_project.yml

    -- Try changing 'view' to 'table', then re-running dbt


    WITH
      extract_url as (SELECT *, urldecode_arr(ae.context->>'url') as url FROM public.analytics_event ae)
      , realtime_UTM AS (
          SELECT CASE
                   WHEN url LIKE '%utm_medium%' THEN SPLIT_PART(SPLIT_PART(decode_url_part(url),'utm_medium=',2),'&',1)
                   ELSE NULL
                 END AS utm_medium,
                 CASE
                   WHEN url LIKE '%utm_source%' THEN SPLIT_PART(SPLIT_PART(decode_url_part(url),'utm_source=',2),'&',1)
                   ELSE NULL
                 END AS utm_source,
                 CASE
                   WHEN url LIKE '%utm_campaign%' THEN SPLIT_PART(SPLIT_PART(decode_url_part(url),'utm_campaign=',2),'&',1)
                   ELSE NULL
                 END AS utm_campaign,
                 CASE
                   WHEN url LIKE '%utm_content%' THEN SPLIT_PART(SPLIT_PART(decode_url_part(url),'utm_content=',2),'&',1)
                   ELSE NULL
                 END AS utm_content
                 , *
          FROM extract_url
      )
    SELECT realtime_UTM.id, realtime_UTM.utm_source, realtime_UTM.utm_campaign, realtime_UTM.utm_medium, realtime_UTM.utm_content
         , realtime_UTM.handle_time
    FROM realtime_UTM
    WHERE NOT(realtime_UTM.utm_medium IS NULL AND realtime_UTM.utm_source IS NULL AND realtime_UTM.utm_campaign IS NULL AND realtime_UTM.utm_content IS NULL)

      -- this filter will only be applied on an incremental run
      AND realtime_UTM.handle_time >= (select max(handle_time) from "analyticsd"."public"."real_time_utm")

    );
2019-07-11 14:19:47,432 (Thread-2): SQL status: BEGIN in 0.02 seconds
2019-07-11 14:19:47,432 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,432 (Thread-2): On earliest_user_events: create temporary table
    "earliest_user_events__dbt_tmp20190711141947408781"
    as (
    -- Welcome to your first dbt model!
    -- Did you know that you can also configure models directly within
    -- the SQL file? This will override configurations stated in dbt_project.yml

    -- Try changing 'view' to 'table', then re-running dbt


    SELECT DISTINCT s.id, uui.user_id, uui.type, s.handle_time
    FROM public.unified_user_id uui
        INNER JOIN LATERAL
        (
            SELECT id, handle_time
            FROM public.unified_user_id uui2
            WHERE 1=1
                AND uui2.user_id = uui.user_id
                AND uui2.type = uui.type
                -- this filter will only be applied on an incremental run
                AND uui2.handle_time >= (select max(handle_time) from "analyticsd"."public"."earliest_user_events")

            ORDER BY uui2.handle_time LIMIT 1
        ) s ON TRUE
        -- this join will only be applied on an incremental run
        LEFT OUTER JOIN "analyticsd"."public"."earliest_user_events" eue ON eue.user_id = uui.user_id AND eue.type = uui.type

    WHERE 1=1
        AND uui.user_id IS NOT NULL
        -- this join will only be applied on an incremental run
        AND eue.user_id IS NULL

    );
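The `earliest_user_events` statement above relies on `INNER JOIN LATERAL ... ORDER BY uui2.handle_time LIMIT 1` to keep exactly one earliest event per `(user_id, type)` pair. The same selection rule in a plain-Python sketch (the rows and the helper name are illustrative, not taken from the log):

```python
def earliest_per_key(rows):
    """Keep, for each (user_id, type) pair, the row with the smallest
    handle_time -- what the LATERAL 'ORDER BY handle_time LIMIT 1' picks."""
    best = {}
    for row in rows:
        _id, user_id, event_type, handle_time = row
        key = (user_id, event_type)
        if key not in best or handle_time < best[key][3]:
            best[key] = row
    return sorted(best.values())

rows = [
    (1, "u1", "auth.UserSignedIn", "2019-07-11T10:00"),
    (2, "u1", "auth.UserSignedIn", "2019-07-11T09:00"),  # earlier duplicate wins
    (3, "u2", "client.pageViewed", "2019-07-11T11:00"),
]
# earliest_per_key(rows) keeps the rows with ids 2 and 3
```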
2019-07-11 14:19:47,449 (Thread-4): SQL status: BEGIN in 0.02 seconds
2019-07-11 14:19:47,449 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,449 (Thread-4): On unified_user_id: create temporary table
    "unified_user_id__dbt_tmp20190711141947373838"
    as (
    -- Welcome to your first dbt model!
    -- Did you know that you can also configure models directly within
    -- the SQL file? This will override configurations stated in dbt_project.yml

    -- Try changing 'view' to 'table', then re-running dbt


    SELECT id,
           CASE
             WHEN (caller_id = target_user_id) THEN caller_id
             WHEN ae.type IN (
                 'client.pageTime'
                 , 'auth.CredRemoved'
                 , 'client.buttonClicked'
                 , 'client.clientIdentified'
                 , 'client.linkClicked'
                 , 'client.pageViewed'
                 , 'contractd.AwardCreated'
                 , 'questd.DraftStageCompleted'
                 , 'questd.ExportReviewCompleted'
                 , 'questd.InitialReviewCompleted'
                 , 'questd.LitReviewCompleted'
                 , 'questd.PrescreenCompleted'
                 , 'questd.Proceed'
                 , 'questd.QuestCreated'
                 , 'questd.QuestUpdated'
                 , 'questd.StartStage'
                 , 'questd.TeamUpdated'
                 , 'reviewd.ExportPassed'
                 , 'reviewd.FollowupCreated'
                 , 'reviewd.FollowupUpdated'
                 , 'reviewd.IDMFail'
                 , 'reviewd.InitialPassed'
                 , 'reviewd.InternalCompleted'
                 , 'reviewd.InternalFailed'
                 , 'reviewd.LitCompleted'
                 , 'reviewd.PreScreenFailed'
                 , 'reviewd.PreScreenPassed'
                 , 'reviewd.ReviewCreated'
                 , 'reviewd.ReviewUpdated'
                 , 'search.Search'
                 , 'solutiond.SolutionAssetAdded'
                 , 'solutiond.SolutionCreated'
                 , 'solutiond.SolutionDeleted'
                 , 'solutiond.SolutionUpdated'
                 , 'userd.DisclosureAccepted'
                 , 'userd.EntradaDeleted'
                 , 'userd.InvitationCreated'
                 , 'userd.UserUpdated'
             ) THEN caller_id
             WHEN ae.type IN (
                 'auth.CredRegistered'
                 , 'auth.EmailVerified'
                 , 'auth.PasswordReset'
                 , 'auth.PasswordResetInitiated'
                 , 'auth.UserCreated'
                 , 'auth.UserSignedIn'
                 , 'auth.UserSignedOut'
                 , 'contractd.ContractCreated'
                 , 'referrald.ReferralContractCreated'
                 , 'referrald.ReferralCreated'
                 , 'referrald.ReferralInviteCreated'
                 , 'userd.EntradaCreated'
                 , 'userd.EntradaShareUpdated'
                 , 'userd.UserCreated'
                 , 'userd.UserIDMMatch'
                 , 'userd.UserPicUpdated'
             ) THEN target_user_id
             ELSE COALESCE(caller_id, target_user_id)
           END as user_id
           , type, handle_time
    FROM public.analytics_event ae
    WHERE 1=1

        -- this filter will only be applied on an incremental run
        AND handle_time >= (select max(handle_time) from "analyticsd"."public"."unified_user_id")

        AND NOT(caller_id IS NULL AND target_user_id IS NULL)
    );
2019-07-11 14:19:47,454 (Thread-1): SQL status: BEGIN in 0.02 seconds
2019-07-11 14:19:47,455 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:47,455 (Thread-1): On anonymous_user_mappings: create temporary table
    "anonymous_user_mappings__dbt_tmp20190711141947397833"
    as (
    -- defining unique_key='an_id' means that we'll never have more than 1 row for a given an_id


    SELECT DISTINCT ae.an_id, uui.user_id
    FROM public.analytics_event ae
        INNER JOIN public.unified_user_id uui ON uui.id = ae.id
        INNER JOIN
        (
            SELECT ae2.an_id
            FROM public.analytics_event ae2
                INNER JOIN public.unified_user_id uui ON uui.id = ae2.id
            GROUP BY ae2.an_id
            HAVING COUNT(DISTINCT uui.user_id) = 1
        ) as unique_records ON unique_records.an_id = ae.an_id
        -- this join will only be applied on an incremental run
        LEFT OUTER JOIN "analyticsd"."public"."anonymous_user_mappings" aum ON aum.an_id = ae.an_id

    WHERE 1=1
        AND ae.an_id IS NOT NULL
        -- this guarantees that we don't refresh or add an_id values we already have
        AND aum.an_id IS NULL

    );
2019-07-11 14:19:47,478 (Thread-4): SQL status: SELECT 743 in 0.03 seconds
2019-07-11 14:19:47,482 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,482 (Thread-4): On unified_user_id:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from information_schema.columns
    where table_name = 'unified_user_id__dbt_tmp20190711141947373838'
    order by ordinal_position

2019-07-11 14:19:47,508 (Thread-4): SQL status: SELECT 4 in 0.03 seconds
2019-07-11 14:19:47,513 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,513 (Thread-4): On unified_user_id:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'unified_user_id'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:47,530 (Thread-4): SQL status: SELECT 4 in 0.02 seconds
2019-07-11 14:19:47,535 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,536 (Thread-4): On unified_user_id:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'unified_user_id'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:47,551 (Thread-4): SQL status: SELECT 4 in 0.02 seconds
2019-07-11 14:19:47,555 (Thread-4): Writing runtime SQL for node "model.analytics_augmentation_package.unified_user_id"
2019-07-11 14:19:47,556 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,556 (Thread-4): On unified_user_id:

    delete
    from "analyticsd"."public"."unified_user_id"
    where (id) in (
        select (id)
        from "unified_user_id__dbt_tmp20190711141947373838"
    );

    insert into "analyticsd"."public"."unified_user_id" ("id", "user_id", "type", "handle_time")
    (
        select "id", "user_id", "type", "handle_time"
        from "unified_user_id__dbt_tmp20190711141947373838"
    );
2019-07-11 14:19:47,563 (Thread-2): SQL status: SELECT 111 in 0.13 seconds
2019-07-11 14:19:47,570 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,570 (Thread-2): On earliest_user_events:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from information_schema.columns
    where table_name = 'earliest_user_events__dbt_tmp20190711141947408781'
    order by ordinal_position

2019-07-11 14:19:47,590 (Thread-2): SQL status: SELECT 4 in 0.02 seconds
2019-07-11 14:19:47,595 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,595 (Thread-2): On earliest_user_events:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'earliest_user_events'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:47,611 (Thread-2): SQL status: SELECT 4 in 0.02 seconds
2019-07-11 14:19:47,614 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,614 (Thread-2): On earliest_user_events:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'earliest_user_events'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:47,630 (Thread-2): SQL status: SELECT 4 in 0.02 seconds
2019-07-11 14:19:47,632 (Thread-2): Writing runtime SQL for node "model.analytics_augmentation_package.earliest_user_events"
2019-07-11 14:19:47,632 (Thread-4): SQL status: INSERT 0 743 in 0.07 seconds
2019-07-11 14:19:47,634 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,634 (Thread-4): On unified_user_id:

create index if not exists "unified_user_id__index_on_user_id" on "analyticsd"."public"."unified_user_id" ("user_id")

2019-07-11 14:19:47,634 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,634 (Thread-2): On earliest_user_events:

    delete
    from "analyticsd"."public"."earliest_user_events"
    where (id) in (
        select (id)
        from "earliest_user_events__dbt_tmp20190711141947408781"
    );

    insert into "analyticsd"."public"."earliest_user_events" ("id", "user_id", "type", "handle_time")
    (
        select "id", "user_id", "type", "handle_time"
        from "earliest_user_events__dbt_tmp20190711141947408781"
    );

2019-07-11 14:19:47,646 (Thread-4): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:47,647 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,647 (Thread-4): On unified_user_id:

create index if not exists "unified_user_id__index_on_id" on "analyticsd"."public"."unified_user_id" ("id")

2019-07-11 14:19:47,649 (Thread-2): SQL status: INSERT 0 111 in 0.01 seconds
2019-07-11 14:19:47,650 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,651 (Thread-2): On earliest_user_events:

create index if not exists "earliest_user_events__index_on_user_id" on "analyticsd"."public"."earliest_user_events" ("user_id")

2019-07-11 14:19:47,658 (Thread-4): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:47,660 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,660 (Thread-4): On unified_user_id:

create index if not exists "unified_user_id__index_on_handle_time" on "analyticsd"."public"."unified_user_id" ("handle_time")

2019-07-11 14:19:47,662 (Thread-2): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:47,663 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,663 (Thread-2): On earliest_user_events:

create index if not exists "earliest_user_events__index_on_type" on "analyticsd"."public"."earliest_user_events" ("type")

2019-07-11 14:19:47,672 (Thread-4): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:47,672 (Thread-4): On unified_user_id: COMMIT
2019-07-11 14:19:47,673 (Thread-4): Using postgres connection "unified_user_id".
2019-07-11 14:19:47,673 (Thread-4): On unified_user_id: COMMIT
2019-07-11 14:19:47,677 (Thread-2): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:47,679 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,679 (Thread-2): On earliest_user_events:

create index if not exists "earliest_user_events__index_on_handle_time" on "analyticsd"."public"."earliest_user_events" ("handle_time")

2019-07-11 14:19:47,691 (Thread-4): SQL status: COMMIT in 0.02 seconds
2019-07-11 14:19:47,697 (Thread-2): SQL status: CREATE INDEX in 0.02 seconds
2019-07-11 14:19:47,697 (Thread-2): On earliest_user_events: COMMIT
2019-07-11 14:19:47,698 (Thread-4): Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '4c16feef-65df-4593-b169-351e0833a654', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D2392A58>]}
2019-07-11 14:19:47,698 (Thread-2): Using postgres connection "earliest_user_events".
2019-07-11 14:19:47,699 (Thread-2): On earliest_user_events: COMMIT
2019-07-11 14:19:47,715 (Thread-2): SQL status: COMMIT in 0.02 seconds
2019-07-11 14:19:47,720 (Thread-2): Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '4c16feef-65df-4593-b169-351e0833a654', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D22B5EB8>]}
2019-07-11 14:19:48,016 (Thread-4): 14:19:48 | 4 of 4 OK created incremental model public.unified_user_id........... [INSERT 0 743 in 0.49s]
2019-07-11 14:19:48,041 (Thread-1): SQL status: SELECT 19 in 0.59 seconds
2019-07-11 14:19:48,052 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,052 (Thread-1): On anonymous_user_mappings:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from information_schema.columns
    where table_name = 'anonymous_user_mappings__dbt_tmp20190711141947397833'
    order by ordinal_position

2019-07-11 14:19:48,073 (Thread-1): SQL status: SELECT 2 in 0.02 seconds
2019-07-11 14:19:48,080 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,080 (Thread-1): On anonymous_user_mappings:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'anonymous_user_mappings'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:48,093 (Thread-1): SQL status: SELECT 2 in 0.01 seconds
2019-07-11 14:19:48,099 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,099 (Thread-1): On anonymous_user_mappings:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'anonymous_user_mappings'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:48,111 (Thread-1): SQL status: SELECT 2 in 0.01 seconds
2019-07-11 14:19:48,115 (Thread-1): Writing runtime SQL for node "model.analytics_augmentation_package.anonymous_user_mappings"
2019-07-11 14:19:48,116 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,116 (Thread-1): On anonymous_user_mappings:

    delete
    from "analyticsd"."public"."anonymous_user_mappings"
    where (an_id) in (
        select (an_id)
        from "anonymous_user_mappings__dbt_tmp20190711141947397833"
    );

    insert into "analyticsd"."public"."anonymous_user_mappings" ("an_id", "user_id")
    (
        select "an_id", "user_id"
        from "anonymous_user_mappings__dbt_tmp20190711141947397833"
    );

2019-07-11 14:19:48,131 (Thread-1): SQL status: INSERT 0 19 in 0.01 seconds
2019-07-11 14:19:48,134 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,135 (Thread-1): On anonymous_user_mappings:

create index if not exists "anonymous_user_mappings__index_on_user_id" on "analyticsd"."public"."anonymous_user_mappings" ("user_id")

2019-07-11 14:19:48,148 (Thread-1): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:48,150 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,150 (Thread-1): On anonymous_user_mappings:

create index if not exists "anonymous_user_mappings__index_on_an_id" on "analyticsd"."public"."anonymous_user_mappings" ("an_id")

2019-07-11 14:19:48,161 (Thread-1): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:48,162 (Thread-1): On anonymous_user_mappings: COMMIT
2019-07-11 14:19:48,162 (Thread-1): Using postgres connection "anonymous_user_mappings".
2019-07-11 14:19:48,163 (Thread-1): On anonymous_user_mappings: COMMIT
2019-07-11 14:19:48,174 (Thread-1): SQL status: COMMIT in 0.01 seconds
2019-07-11 14:19:48,179 (Thread-1): Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '4c16feef-65df-4593-b169-351e0833a654', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D23682B0>]}
2019-07-11 14:19:48,367 (Thread-2): 14:19:48 | 2 of 4 OK created incremental model public.earliest_user_events...... [INSERT 0 111 in 0.51s]
2019-07-11 14:19:48,699 (Thread-1): 14:19:48 | 1 of 4 OK created incremental model public.anonymous_user_mappings... [INSERT 0 19 in 0.97s]
2019-07-11 14:19:49,697 (Thread-3): SQL status: SELECT 57 in 2.26 seconds
2019-07-11 14:19:49,712 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,713 (Thread-3): On real_time_utm:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from information_schema.columns
    where table_name = 'real_time_utm__dbt_tmp20190711141947380821'
    order by ordinal_position

2019-07-11 14:19:49,732 (Thread-3): SQL status: SELECT 6 in 0.02 seconds
2019-07-11 14:19:49,735 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,735 (Thread-3): On real_time_utm:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'real_time_utm'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:49,749 (Thread-3): SQL status: SELECT 6 in 0.01 seconds
2019-07-11 14:19:49,753 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,753 (Thread-3): On real_time_utm:
    select
        column_name,
        data_type,
        character_maximum_length,
        numeric_precision,
        numeric_scale
    from "analyticsd".information_schema.columns
    where table_name = 'real_time_utm'
      and table_schema = 'public'
    order by ordinal_position

2019-07-11 14:19:49,768 (Thread-3): SQL status: SELECT 6 in 0.02 seconds
2019-07-11 14:19:49,770 (Thread-3): Writing runtime SQL for node "model.analytics_augmentation_package.real_time_utm"
2019-07-11 14:19:49,771 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,771 (Thread-3): On real_time_utm:

    delete
    from "analyticsd"."public"."real_time_utm"
    where (id) in (
        select (id)
        from "real_time_utm__dbt_tmp20190711141947380821"
    );

    insert into "analyticsd"."public"."real_time_utm" ("id", "utm_source", "utm_campaign", "utm_medium", "utm_content", "handle_time")
    (
        select "id", "utm_source", "utm_campaign", "utm_medium", "utm_content", "handle_time"
        from "real_time_utm__dbt_tmp20190711141947380821"
    );

2019-07-11 14:19:49,784 (Thread-3): SQL status: INSERT 0 57 in 0.01 seconds
2019-07-11 14:19:49,786 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,786 (Thread-3): On real_time_utm:

create index if not exists "real_time_utm__index_on_id" on "analyticsd"."public"."real_time_utm" ("id")

2019-07-11 14:19:49,800 (Thread-3): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:49,802 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,802 (Thread-3): On real_time_utm:

create index if not exists "real_time_utm__index_on_handle_time" on "analyticsd"."public"."real_time_utm" ("handle_time")

2019-07-11 14:19:49,815 (Thread-3): SQL status: CREATE INDEX in 0.01 seconds
2019-07-11 14:19:49,816 (Thread-3): On real_time_utm: COMMIT
2019-07-11 14:19:49,816 (Thread-3): Using postgres connection "real_time_utm".
2019-07-11 14:19:49,816 (Thread-3): On real_time_utm: COMMIT
2019-07-11 14:19:49,829 (Thread-3): SQL status: COMMIT in 0.01 seconds
2019-07-11 14:19:49,834 (Thread-3): Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '4c16feef-65df-4593-b169-351e0833a654', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D2409F60>]}
2019-07-11 14:19:50,185 (Thread-3): 14:19:50 | 3 of 4 OK created incremental model public.real_time_utm............. [INSERT 0 57 in 2.62s]
2019-07-11 14:19:50,238 (MainThread): Using postgres connection "master".
2019-07-11 14:19:50,238 (MainThread): On master: BEGIN
2019-07-11 14:19:50,253 (MainThread): SQL status: BEGIN in 0.01 seconds
2019-07-11 14:19:50,254 (MainThread): On master: COMMIT
2019-07-11 14:19:50,254 (MainThread): Using postgres connection "master".
2019-07-11 14:19:50,254 (MainThread): On master: COMMIT
2019-07-11 14:19:50,266 (MainThread): SQL status: COMMIT in 0.01 seconds
2019-07-11 14:19:50,266 (MainThread): 14:19:50 |
2019-07-11 14:19:50,267 (MainThread): 14:19:50 | Finished running 4 incremental models in 3.51s.
2019-07-11 14:19:50,268 (MainThread): Connection 'master' was left open.
2019-07-11 14:19:50,268 (MainThread): On master: Close
2019-07-11 14:19:50,269 (MainThread): Connection 'anonymous_user_mappings' was left open.
2019-07-11 14:19:50,269 (MainThread): On anonymous_user_mappings: Close
2019-07-11 14:19:50,269 (MainThread): Connection 'earliest_user_events' was left open.
2019-07-11 14:19:50,270 (MainThread): On earliest_user_events: Close
2019-07-11 14:19:50,270 (MainThread): Connection 'real_time_utm' was left open.
2019-07-11 14:19:50,270 (MainThread): On real_time_utm: Close
2019-07-11 14:19:50,270 (MainThread): Connection 'unified_user_id' was left open.
2019-07-11 14:19:50,270 (MainThread): On unified_user_id: Close
2019-07-11 14:19:50,277 (MainThread):
2019-07-11 14:19:50,277 (MainThread): Completed successfully
2019-07-11 14:19:50,278 (MainThread):
Done. PASS=4 ERROR=0 SKIP=0 TOTAL=4
2019-07-11 14:19:50,279 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D0F70FD0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D23F9BE0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x00000217D23F9978>]}
2019-07-11 14:19:50,607 (MainThread): Flushing usage events
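Note: the runtime SQL in this log follows dbt's delete+insert incremental strategy: build a temp table of new rows, delete target rows whose unique key appears in it, then insert everything from it. Below is a minimal sketch of that pattern using Python's sqlite3 against an in-memory database; the table name `target` and its columns are illustrative, not the ones from this run.

```python
import sqlite3

# Hypothetical target table with one stale row (id=2).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER, user_id TEXT, handle_time TEXT)")
cur.executemany(
    "INSERT INTO target VALUES (?, ?, ?)",
    [(1, "a", "2019-07-10"), (2, "stale", "2019-07-10")],
)

# Temp table standing in for dbt's <model>__dbt_tmp relation: the rows the
# incremental model SQL produced on this run (one refreshed, one brand new).
cur.execute(
    "CREATE TEMP TABLE target__dbt_tmp AS "
    "SELECT 2 AS id, 'fresh' AS user_id, '2019-07-11' AS handle_time "
    "UNION ALL SELECT 3, 'new', '2019-07-11'"
)

# delete+insert: drop target rows whose unique key is in the temp table,
# then append all temp rows -- mirroring the runtime SQL in the log.
cur.execute("DELETE FROM target WHERE id IN (SELECT id FROM target__dbt_tmp)")
cur.execute("INSERT INTO target SELECT * FROM target__dbt_tmp")
conn.commit()

rows = sorted(cur.execute("SELECT id, user_id FROM target").fetchall())
print(rows)  # [(1, 'a'), (2, 'fresh'), (3, 'new')]
```

The effect is an upsert keyed on `id`: unchanged rows survive, refreshed rows are replaced rather than duplicated, and new rows are appended, which is why the log reports `INSERT 0 743` style statuses rather than updates.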