{"diffoscope-json-version": 1, "source1": "/input1", "source2": "/input2", "unified_diff": null, "details": [{"source1": "zipinfo -v {}", "source2": "zipinfo -v {}", "unified_diff": "@@ -1376,15 +1376,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         5641aac8\n+  32-bit CRC value (hex):                         c1fbf07e\n   compressed size:                                47291 bytes\n   uncompressed size:                              47291 bytes\n   length of filename:                             47 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1448,15 +1448,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         aaf7519c\n+  32-bit CRC value (hex):                         dce14b33\n   compressed size:                                29830 bytes\n   uncompressed size:                              29830 bytes\n   length of 
filename:                             46 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1556,15 +1556,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         ffbe84c8\n+  32-bit CRC value (hex):                         a8e073c6\n   compressed size:                                28998 bytes\n   uncompressed size:                              28998 bytes\n   length of filename:                             54 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1592,15 +1592,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         2300e66b\n+  32-bit CRC value (hex):                         38065db0\n   compressed size:              
                  381693 bytes\n   uncompressed size:                              381693 bytes\n   length of filename:                             70 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1628,15 +1628,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         ae956486\n+  32-bit CRC value (hex):                         080226f5\n   compressed size:                                383110 bytes\n   uncompressed size:                              383110 bytes\n   length of filename:                             62 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1700,15 +1700,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                 
        9cd9501f\n+  32-bit CRC value (hex):                         e9aa6133\n   compressed size:                                2061814 bytes\n   uncompressed size:                              2061814 bytes\n   length of filename:                             47 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1736,15 +1736,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         2ee11aa4\n+  32-bit CRC value (hex):                         40b6bff3\n   compressed size:                                510063 bytes\n   uncompressed size:                              510063 bytes\n   length of filename:                             65 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1772,15 +1772,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   
file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         de96f8a1\n+  32-bit CRC value (hex):                         2c9c4261\n   compressed size:                                149835 bytes\n   uncompressed size:                              149835 bytes\n   length of filename:                             63 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1808,15 +1808,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         f1f84e71\n+  32-bit CRC value (hex):                         f055da0c\n   compressed size:                                667306 bytes\n   uncompressed size:                              667306 bytes\n   length of filename:                             64 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1844,15 +1844,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS 
date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         e2deea1a\n+  32-bit CRC value (hex):                         039fcb02\n   compressed size:                                1414445 bytes\n   uncompressed size:                              1414445 bytes\n   length of filename:                             65 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1880,15 +1880,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         035efbc0\n+  32-bit CRC value (hex):                         5a0ac22d\n   compressed size:                                56176 bytes\n   uncompressed size:                              56176 bytes\n   length of filename:                             70 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1916,15 +1916,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                
           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         ce3bf434\n+  32-bit CRC value (hex):                         d06c381a\n   compressed size:                                286926 bytes\n   uncompressed size:                              286926 bytes\n   length of filename:                             66 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n@@ -1952,15 +1952,15 @@\n   minimum software version required to extract:   2.0\n   compression method:                             none (stored)\n   file security status:                           not encrypted\n   extended local header:                          yes\n   file last modified on (DOS date/time):          2098 Jan 1 00:00:00\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 local\n   file last modified on (UT extra field modtime): 1970 Jan 1 00:00:00 UTC\n-  32-bit CRC value (hex):                         3b65c515\n+  32-bit CRC value (hex):                         0662a46c\n   compressed size:                                286957 bytes\n   uncompressed size:                              286957 bytes\n   length of filename:                             66 characters\n   length of extra field:                          9 bytes\n   length of file comment:                         0 characters\n   disk number on which file begins:               disk 1\n   apparent file type:                             binary\n"}, {"source1": "zipdetails --redact --scan --utc {}", "source2": 
"zipdetails --redact --scan --utc {}", "unified_diff": "@@ -1013,15 +1013,15 @@\n 046A75 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 046A77   Length              0005 (5)\n 046A79   Flags               01 (1) 'Modification'\n 046A7A   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 046A7E PAYLOAD\n \n 052339 DATA DESCRIPTOR       08074B50 (134695760)\n-05233D CRC                   5641AAC8 (1447144136)\n+05233D CRC                   C1FBF07E (3254513790)\n 052341 Compressed Size       0000B8BB (47291)\n 052345 Uncompressed Size     0000B8BB (47291)\n \n 052349 LOCAL HEADER #40      04034B50 (67324752)\n 05234D Extract Zip Spec      14 (20) '2.0'\n 05234E Extract OS            00 (0) 'MS-DOS'\n 05234F General Purpose Flag  0008 (8)\n@@ -1069,15 +1069,15 @@\n 0528FA Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 0528FC   Length              0005 (5)\n 0528FE   Flags               01 (1) 'Modification'\n 0528FF   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 052903 PAYLOAD\n \n 059D89 DATA DESCRIPTOR       08074B50 (134695760)\n-059D8D CRC                   AAF7519C (2868335004)\n+059D8D CRC                   DCE14B33 (3705752371)\n 059D91 Compressed Size       00007486 (29830)\n 059D95 Uncompressed Size     00007486 (29830)\n \n 059D99 LOCAL HEADER #42      04034B50 (67324752)\n 059D9D Extract Zip Spec      14 (20) '2.0'\n 059D9E Extract OS            00 (0) 'MS-DOS'\n 059D9F General Purpose Flag  0008 (8)\n@@ -1153,15 +1153,15 @@\n 05D2C2 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 05D2C4   Length              0005 (5)\n 05D2C6   Flags               01 (1) 'Modification'\n 05D2C7   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 05D2CB PAYLOAD\n \n 064411 DATA DESCRIPTOR       08074B50 (134695760)\n-064415 CRC                   FFBE84C8 (4290675912)\n+064415 CRC                   A8E073C6 (2833281990)\n 064419 Compressed Size       00007146 (28998)\n 06441D Uncompressed 
Size     00007146 (28998)\n \n 064421 LOCAL HEADER #45      04034B50 (67324752)\n 064425 Extract Zip Spec      14 (20) '2.0'\n 064426 Extract OS            00 (0) 'MS-DOS'\n 064427 General Purpose Flag  0008 (8)\n@@ -1181,15 +1181,15 @@\n 064485 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 064487   Length              0005 (5)\n 064489   Flags               01 (1) 'Modification'\n 06448A   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 06448E PAYLOAD\n \n 0C178B DATA DESCRIPTOR       08074B50 (134695760)\n-0C178F CRC                   2300E66B (587261547)\n+0C178F CRC                   38065DB0 (939941296)\n 0C1793 Compressed Size       0005D2FD (381693)\n 0C1797 Uncompressed Size     0005D2FD (381693)\n \n 0C179B LOCAL HEADER #46      04034B50 (67324752)\n 0C179F Extract Zip Spec      14 (20) '2.0'\n 0C17A0 Extract OS            00 (0) 'MS-DOS'\n 0C17A1 General Purpose Flag  0008 (8)\n@@ -1209,15 +1209,15 @@\n 0C17F7 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 0C17F9   Length              0005 (5)\n 0C17FB   Flags               01 (1) 'Modification'\n 0C17FC   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 0C1800 PAYLOAD\n \n 11F086 DATA DESCRIPTOR       08074B50 (134695760)\n-11F08A CRC                   AE956486 (2929026182)\n+11F08A CRC                   080226F5 (134358773)\n 11F08E Compressed Size       0005D886 (383110)\n 11F092 Uncompressed Size     0005D886 (383110)\n \n 11F096 LOCAL HEADER #47      04034B50 (67324752)\n 11F09A Extract Zip Spec      14 (20) '2.0'\n 11F09B Extract OS            00 (0) 'MS-DOS'\n 11F09C General Purpose Flag  0008 (8)\n@@ -1265,15 +1265,15 @@\n 13538F Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 135391   Length              0005 (5)\n 135393   Flags               01 (1) 'Modification'\n 135394   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 135398 PAYLOAD\n \n 32C98E DATA DESCRIPTOR       08074B50 (134695760)\n-32C992 CRC      
             9CD9501F (2631487519)\n+32C992 CRC                   E9AA6133 (3920257331)\n 32C996 Compressed Size       001F75F6 (2061814)\n 32C99A Uncompressed Size     001F75F6 (2061814)\n \n 32C99E LOCAL HEADER #49      04034B50 (67324752)\n 32C9A2 Extract Zip Spec      14 (20) '2.0'\n 32C9A3 Extract OS            00 (0) 'MS-DOS'\n 32C9A4 General Purpose Flag  0008 (8)\n@@ -1293,15 +1293,15 @@\n 32C9FD Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 32C9FF   Length              0005 (5)\n 32CA01   Flags               01 (1) 'Modification'\n 32CA02   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 32CA06 PAYLOAD\n \n 3A9275 DATA DESCRIPTOR       08074B50 (134695760)\n-3A9279 CRC                   2EE11AA4 (786504356)\n+3A9279 CRC                   40B6BFF3 (1085718515)\n 3A927D Compressed Size       0007C86F (510063)\n 3A9281 Uncompressed Size     0007C86F (510063)\n \n 3A9285 LOCAL HEADER #50      04034B50 (67324752)\n 3A9289 Extract Zip Spec      14 (20) '2.0'\n 3A928A Extract OS            00 (0) 'MS-DOS'\n 3A928B General Purpose Flag  0008 (8)\n@@ -1321,15 +1321,15 @@\n 3A92E2 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 3A92E4   Length              0005 (5)\n 3A92E6   Flags               01 (1) 'Modification'\n 3A92E7   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 3A92EB PAYLOAD\n \n 3CDC36 DATA DESCRIPTOR       08074B50 (134695760)\n-3CDC3A CRC                   DE96F8A1 (3734436001)\n+3CDC3A CRC                   2C9C4261 (748438113)\n 3CDC3E Compressed Size       0002494B (149835)\n 3CDC42 Uncompressed Size     0002494B (149835)\n \n 3CDC46 LOCAL HEADER #51      04034B50 (67324752)\n 3CDC4A Extract Zip Spec      14 (20) '2.0'\n 3CDC4B Extract OS            00 (0) 'MS-DOS'\n 3CDC4C General Purpose Flag  0008 (8)\n@@ -1349,15 +1349,15 @@\n 3CDCA4 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 3CDCA6   Length              0005 (5)\n 3CDCA8   Flags               01 (1) 
'Modification'\n 3CDCA9   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 3CDCAD PAYLOAD\n \n 470B57 DATA DESCRIPTOR       08074B50 (134695760)\n-470B5B CRC                   F1F84E71 (4059582065)\n+470B5B CRC                   F055DA0C (4032158220)\n 470B5F Compressed Size       000A2EAA (667306)\n 470B63 Uncompressed Size     000A2EAA (667306)\n \n 470B67 LOCAL HEADER #52      04034B50 (67324752)\n 470B6B Extract Zip Spec      14 (20) '2.0'\n 470B6C Extract OS            00 (0) 'MS-DOS'\n 470B6D General Purpose Flag  0008 (8)\n@@ -1377,15 +1377,15 @@\n 470BC6 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 470BC8   Length              0005 (5)\n 470BCA   Flags               01 (1) 'Modification'\n 470BCB   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 470BCF PAYLOAD\n \n 5CA0FC DATA DESCRIPTOR       08074B50 (134695760)\n-5CA100 CRC                   E2DEEA1A (3806259738)\n+5CA100 CRC                   039FCB02 (60803842)\n 5CA104 Compressed Size       0015952D (1414445)\n 5CA108 Uncompressed Size     0015952D (1414445)\n \n 5CA10C LOCAL HEADER #53      04034B50 (67324752)\n 5CA110 Extract Zip Spec      14 (20) '2.0'\n 5CA111 Extract OS            00 (0) 'MS-DOS'\n 5CA112 General Purpose Flag  0008 (8)\n@@ -1405,15 +1405,15 @@\n 5CA170 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 5CA172   Length              0005 (5)\n 5CA174   Flags               01 (1) 'Modification'\n 5CA175   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 5CA179 PAYLOAD\n \n 5D7CE9 DATA DESCRIPTOR       08074B50 (134695760)\n-5D7CED CRC                   035EFBC0 (56556480)\n+5D7CED CRC                   5A0AC22D (1510654509)\n 5D7CF1 Compressed Size       0000DB70 (56176)\n 5D7CF5 Uncompressed Size     0000DB70 (56176)\n \n 5D7CF9 LOCAL HEADER #54      04034B50 (67324752)\n 5D7CFD Extract Zip Spec      14 (20) '2.0'\n 5D7CFE Extract OS            00 (0) 'MS-DOS'\n 5D7CFF General Purpose Flag  0008 (8)\n@@ -1433,15 
+1433,15 @@\n 5D7D59 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 5D7D5B   Length              0005 (5)\n 5D7D5D   Flags               01 (1) 'Modification'\n 5D7D5E   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 5D7D62 PAYLOAD\n \n 61DE30 DATA DESCRIPTOR       08074B50 (134695760)\n-61DE34 CRC                   CE3BF434 (3460035636)\n+61DE34 CRC                   D06C381A (3496753178)\n 61DE38 Compressed Size       000460CE (286926)\n 61DE3C Uncompressed Size     000460CE (286926)\n \n 61DE40 LOCAL HEADER #55      04034B50 (67324752)\n 61DE44 Extract Zip Spec      14 (20) '2.0'\n 61DE45 Extract OS            00 (0) 'MS-DOS'\n 61DE46 General Purpose Flag  0008 (8)\n@@ -1461,15 +1461,15 @@\n 61DEA0 Extra ID #1           5455 (21589) 'Extended Timestamp [UT]'\n 61DEA2   Length              0005 (5)\n 61DEA4   Flags               01 (1) 'Modification'\n 61DEA5   Modification Time   00000000 (0) 'Thu Jan  1 00:00:00 1970'\n 61DEA9 PAYLOAD\n \n 663F96 DATA DESCRIPTOR       08074B50 (134695760)\n-663F9A CRC                   3B65C515 (996525333)\n+663F9A CRC                   0662A46C (107127916)\n 663F9E Compressed Size       000460ED (286957)\n 663FA2 Uncompressed Size     000460ED (286957)\n \n 663FA6 LOCAL HEADER #56      04034B50 (67324752)\n 663FAA Extract Zip Spec      14 (20) '2.0'\n 663FAB Extract OS            00 (0) 'MS-DOS'\n 663FAC General Purpose Flag  0008 (8)\n@@ -2884,15 +2884,15 @@\n 675862 Created OS            00 (0) 'MS-DOS'\n 675863 Extract Zip Spec      14 (20) '2.0'\n 675864 Extract OS            00 (0) 'MS-DOS'\n 675865 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675867 Compression Method    0000 (0) 'Stored'\n 675869 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-67586D CRC                   5641AAC8 (1447144136)\n+67586D CRC                   C1FBF07E (3254513790)\n 675871 Compressed Size       0000B8BB (47291)\n 675875 Uncompressed Size     0000B8BB 
(47291)\n 675879 Filename Length       002F (47)\n 67587B Extra Length          0009 (9)\n 67587D Comment Length        0000 (0)\n 67587F Disk Start            0000 (0)\n 675881 Int File Attributes   0000 (0)\n@@ -2944,15 +2944,15 @@\n 67592A Created OS            00 (0) 'MS-DOS'\n 67592B Extract Zip Spec      14 (20) '2.0'\n 67592C Extract OS            00 (0) 'MS-DOS'\n 67592D General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 67592F Compression Method    0000 (0) 'Stored'\n 675931 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675935 CRC                   AAF7519C (2868335004)\n+675935 CRC                   DCE14B33 (3705752371)\n 675939 Compressed Size       00007486 (29830)\n 67593D Uncompressed Size     00007486 (29830)\n 675941 Filename Length       002E (46)\n 675943 Extra Length          0009 (9)\n 675945 Comment Length        0000 (0)\n 675947 Disk Start            0000 (0)\n 675949 Int File Attributes   0000 (0)\n@@ -3034,15 +3034,15 @@\n 675A61 Created OS            00 (0) 'MS-DOS'\n 675A62 Extract Zip Spec      14 (20) '2.0'\n 675A63 Extract OS            00 (0) 'MS-DOS'\n 675A64 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675A66 Compression Method    0000 (0) 'Stored'\n 675A68 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675A6C CRC                   FFBE84C8 (4290675912)\n+675A6C CRC                   A8E073C6 (2833281990)\n 675A70 Compressed Size       00007146 (28998)\n 675A74 Uncompressed Size     00007146 (28998)\n 675A78 Filename Length       0036 (54)\n 675A7A Extra Length          0009 (9)\n 675A7C Comment Length        0000 (0)\n 675A7E Disk Start            0000 (0)\n 675A80 Int File Attributes   0000 (0)\n@@ -3064,15 +3064,15 @@\n 675ACE Created OS            00 (0) 'MS-DOS'\n 675ACF Extract Zip Spec      14 (20) '2.0'\n 675AD0 Extract OS            00 (0) 'MS-DOS'\n 675AD1 General Purpose Flag  0008 (8)\n        [Bit  3]          
    1 'Streamed'\n 675AD3 Compression Method    0000 (0) 'Stored'\n 675AD5 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675AD9 CRC                   2300E66B (587261547)\n+675AD9 CRC                   38065DB0 (939941296)\n 675ADD Compressed Size       0005D2FD (381693)\n 675AE1 Uncompressed Size     0005D2FD (381693)\n 675AE5 Filename Length       0046 (70)\n 675AE7 Extra Length          0009 (9)\n 675AE9 Comment Length        0000 (0)\n 675AEB Disk Start            0000 (0)\n 675AED Int File Attributes   0000 (0)\n@@ -3094,15 +3094,15 @@\n 675B4B Created OS            00 (0) 'MS-DOS'\n 675B4C Extract Zip Spec      14 (20) '2.0'\n 675B4D Extract OS            00 (0) 'MS-DOS'\n 675B4E General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675B50 Compression Method    0000 (0) 'Stored'\n 675B52 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675B56 CRC                   AE956486 (2929026182)\n+675B56 CRC                   080226F5 (134358773)\n 675B5A Compressed Size       0005D886 (383110)\n 675B5E Uncompressed Size     0005D886 (383110)\n 675B62 Filename Length       003E (62)\n 675B64 Extra Length          0009 (9)\n 675B66 Comment Length        0000 (0)\n 675B68 Disk Start            0000 (0)\n 675B6A Int File Attributes   0000 (0)\n@@ -3154,15 +3154,15 @@\n 675C23 Created OS            00 (0) 'MS-DOS'\n 675C24 Extract Zip Spec      14 (20) '2.0'\n 675C25 Extract OS            00 (0) 'MS-DOS'\n 675C26 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675C28 Compression Method    0000 (0) 'Stored'\n 675C2A Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675C2E CRC                   9CD9501F (2631487519)\n+675C2E CRC                   E9AA6133 (3920257331)\n 675C32 Compressed Size       001F75F6 (2061814)\n 675C36 Uncompressed Size     001F75F6 (2061814)\n 675C3A Filename Length       002F (47)\n 675C3C Extra Length          0009 (9)\n 
675C3E Comment Length        0000 (0)\n 675C40 Disk Start            0000 (0)\n 675C42 Int File Attributes   0000 (0)\n@@ -3184,15 +3184,15 @@\n 675C89 Created OS            00 (0) 'MS-DOS'\n 675C8A Extract Zip Spec      14 (20) '2.0'\n 675C8B Extract OS            00 (0) 'MS-DOS'\n 675C8C General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675C8E Compression Method    0000 (0) 'Stored'\n 675C90 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675C94 CRC                   2EE11AA4 (786504356)\n+675C94 CRC                   40B6BFF3 (1085718515)\n 675C98 Compressed Size       0007C86F (510063)\n 675C9C Uncompressed Size     0007C86F (510063)\n 675CA0 Filename Length       0041 (65)\n 675CA2 Extra Length          0009 (9)\n 675CA4 Comment Length        0000 (0)\n 675CA6 Disk Start            0000 (0)\n 675CA8 Int File Attributes   0000 (0)\n@@ -3214,15 +3214,15 @@\n 675D01 Created OS            00 (0) 'MS-DOS'\n 675D02 Extract Zip Spec      14 (20) '2.0'\n 675D03 Extract OS            00 (0) 'MS-DOS'\n 675D04 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675D06 Compression Method    0000 (0) 'Stored'\n 675D08 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675D0C CRC                   DE96F8A1 (3734436001)\n+675D0C CRC                   2C9C4261 (748438113)\n 675D10 Compressed Size       0002494B (149835)\n 675D14 Uncompressed Size     0002494B (149835)\n 675D18 Filename Length       003F (63)\n 675D1A Extra Length          0009 (9)\n 675D1C Comment Length        0000 (0)\n 675D1E Disk Start            0000 (0)\n 675D20 Int File Attributes   0000 (0)\n@@ -3244,15 +3244,15 @@\n 675D77 Created OS            00 (0) 'MS-DOS'\n 675D78 Extract Zip Spec      14 (20) '2.0'\n 675D79 Extract OS            00 (0) 'MS-DOS'\n 675D7A General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675D7C Compression Method    0000 (0) 'Stored'\n 675D7E Modification 
Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675D82 CRC                   F1F84E71 (4059582065)\n+675D82 CRC                   F055DA0C (4032158220)\n 675D86 Compressed Size       000A2EAA (667306)\n 675D8A Uncompressed Size     000A2EAA (667306)\n 675D8E Filename Length       0040 (64)\n 675D90 Extra Length          0009 (9)\n 675D92 Comment Length        0000 (0)\n 675D94 Disk Start            0000 (0)\n 675D96 Int File Attributes   0000 (0)\n@@ -3274,15 +3274,15 @@\n 675DEE Created OS            00 (0) 'MS-DOS'\n 675DEF Extract Zip Spec      14 (20) '2.0'\n 675DF0 Extract OS            00 (0) 'MS-DOS'\n 675DF1 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675DF3 Compression Method    0000 (0) 'Stored'\n 675DF5 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675DF9 CRC                   E2DEEA1A (3806259738)\n+675DF9 CRC                   039FCB02 (60803842)\n 675DFD Compressed Size       0015952D (1414445)\n 675E01 Uncompressed Size     0015952D (1414445)\n 675E05 Filename Length       0041 (65)\n 675E07 Extra Length          0009 (9)\n 675E09 Comment Length        0000 (0)\n 675E0B Disk Start            0000 (0)\n 675E0D Int File Attributes   0000 (0)\n@@ -3304,15 +3304,15 @@\n 675E66 Created OS            00 (0) 'MS-DOS'\n 675E67 Extract Zip Spec      14 (20) '2.0'\n 675E68 Extract OS            00 (0) 'MS-DOS'\n 675E69 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675E6B Compression Method    0000 (0) 'Stored'\n 675E6D Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675E71 CRC                   035EFBC0 (56556480)\n+675E71 CRC                   5A0AC22D (1510654509)\n 675E75 Compressed Size       0000DB70 (56176)\n 675E79 Uncompressed Size     0000DB70 (56176)\n 675E7D Filename Length       0046 (70)\n 675E7F Extra Length          0009 (9)\n 675E81 Comment Length        0000 (0)\n 675E83 Disk Start            0000 (0)\n 675E85 Int 
File Attributes   0000 (0)\n@@ -3334,15 +3334,15 @@\n 675EE3 Created OS            00 (0) 'MS-DOS'\n 675EE4 Extract Zip Spec      14 (20) '2.0'\n 675EE5 Extract OS            00 (0) 'MS-DOS'\n 675EE6 General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675EE8 Compression Method    0000 (0) 'Stored'\n 675EEA Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675EEE CRC                   CE3BF434 (3460035636)\n+675EEE CRC                   D06C381A (3496753178)\n 675EF2 Compressed Size       000460CE (286926)\n 675EF6 Uncompressed Size     000460CE (286926)\n 675EFA Filename Length       0042 (66)\n 675EFC Extra Length          0009 (9)\n 675EFE Comment Length        0000 (0)\n 675F00 Disk Start            0000 (0)\n 675F02 Int File Attributes   0000 (0)\n@@ -3364,15 +3364,15 @@\n 675F5C Created OS            00 (0) 'MS-DOS'\n 675F5D Extract Zip Spec      14 (20) '2.0'\n 675F5E Extract OS            00 (0) 'MS-DOS'\n 675F5F General Purpose Flag  0008 (8)\n        [Bit  3]              1 'Streamed'\n 675F61 Compression Method    0000 (0) 'Stored'\n 675F63 Modification Time     EC210000 (3961585664) 'Wed Jan  1 00:00:00 2098'\n-675F67 CRC                   3B65C515 (996525333)\n+675F67 CRC                   0662A46C (107127916)\n 675F6B Compressed Size       000460ED (286957)\n 675F6F Uncompressed Size     000460ED (286957)\n 675F73 Filename Length       0042 (66)\n 675F75 Extra Length          0009 (9)\n 675F77 Comment Length        0000 (0)\n 675F79 Disk Start            0000 (0)\n 675F7B Int File Attributes   0000 (0)\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HintParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HintParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HintParser.g 2023-08-07 15:17:01\n+// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HintParser.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import 
org.apache.hadoop.conf.Configuration;\n import org.apache.hadoop.hive.conf.HiveConf;\n \n \n@@ -832,15 +832,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: hintName, hintArgs\n+\t\t\t// elements: hintArgs, hintName\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveLexer.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveLexer.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveLexer.g 2023-08-07 15:16:57\n+// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveLexer.g 2025-01-31 11:38:44\n \n package org.apache.hadoop.hive.ql.parse;\n \n import org.apache.commons.lang3.StringUtils;\n \n \n import org.antlr.runtime.*;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveLexerStandard.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveLexerStandard.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveLexerStandard.g 2023-08-07 15:17:01\n+// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveLexerStandard.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import org.apache.commons.lang3.StringUtils;\n \n \n import org.antlr.runtime.*;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveLexerStandard_HiveLexerParent.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveLexerStandard_HiveLexerParent.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 HiveLexerParent.g 2023-08-07 15:17:01\n+// $ANTLR 3.5.2 HiveLexerParent.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import org.apache.commons.lang3.StringUtils;\n \n \n import org.antlr.runtime.*;\n"}, {"source1": 
"org/apache/hadoop/hive/ql/parse/HiveLexer_HiveLexerParent.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveLexer_HiveLexerParent.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 HiveLexerParent.g 2023-08-07 15:16:58\n+// $ANTLR 3.5.2 HiveLexerParent.g 2025-01-31 11:38:44\n \n package org.apache.hadoop.hive.ql.parse;\n \n import org.apache.commons.lang3.StringUtils;\n \n \n import org.antlr.runtime.*;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveParser.g 2023-08-07 15:45:12\n+// $ANTLR 3.5.2 org/apache/hadoop/hive/ql/parse/HiveParser.g 2025-01-31 11:38:45\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -1748,15 +1748,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_execStatement_in_explainStatement1593);\n \t\t\t\t\texecStatement7=execStatement();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_execStatement.add(execStatement7.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: execStatement, explainOption\n+\t\t\t\t\t// elements: explainOption, execStatement\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3102,15 +3102,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_inputFileFormat.add(inputFileFormat50.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: islocal, isoverwrite, tab, path, inputFileFormat\n+\t\t\t// elements: islocal, inputFileFormat, isoverwrite, path, tab\n \t\t\t// token labels: islocal, path, isoverwrite\n \t\t\t// 
rule labels: tab, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3255,15 +3255,15 @@\n \n \t\t\t}\n \n \t\t\tRPAREN54=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_replicationClause2175); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN54);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: isMetadataOnly, replId\n+\t\t\t// elements: replId, isMetadataOnly\n \t\t\t// token labels: replId, isMetadataOnly\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3400,15 +3400,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replicationClause.add(replicationClause58.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: path, replicationClause, tab\n+\t\t\t// elements: tab, replicationClause, path\n \t\t\t// token labels: path\n \t\t\t// rule labels: tab, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3582,15 +3582,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_tableLocation.add(tableLocation62.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableLocation, tab, ext, path\n+\t\t\t// elements: tab, tableLocation, ext, path\n \t\t\t// token labels: ext, path\n \t\t\t// rule labels: tab, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3892,15 +3892,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replTableLevelPolicy.add(tablePolicy.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST 
REWRITE\n-\t\t\t// elements: tablePolicy, dbName\n+\t\t\t// elements: dbName, tablePolicy\n \t\t\t// token labels: \n \t\t\t// rule labels: dbName, tablePolicy, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4047,15 +4047,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replConfigs.add(replConf.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: sourceDbPolicy, replConf, dbName\n+\t\t\t// elements: dbName, sourceDbPolicy, replConf\n \t\t\t// token labels: \n \t\t\t// rule labels: dbName, sourceDbPolicy, retval, replConf\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4399,15 +4399,15 @@\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: replTablesExcludeList, replTablesIncludeList\n+\t\t\t// elements: replTablesIncludeList, replTablesExcludeList\n \t\t\t// token labels: replTablesExcludeList, replTablesIncludeList\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4533,15 +4533,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replConfigs.add(replConf.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: replConf, dbName\n+\t\t\t// elements: dbName, replConf\n \t\t\t// token labels: \n \t\t\t// rule labels: dbName, retval, replConf\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -6305,15 +6305,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_dbProperties.add(dbprops.getTree());\n \t\t\t\t\t\t\t}\n 
\t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: dbLocation, dbprops, name, databaseComment, dbManagedLocation, ifNotExists\n+\t\t\t\t\t// elements: dbManagedLocation, dbLocation, dbprops, name, databaseComment, ifNotExists\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: name, dbprops, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -6489,15 +6489,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_dbProperties.add(dbprops.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: name, databaseComment, dbConnectorName, ifNotExists, dbprops\n+\t\t\t\t\t// elements: dbConnectorName, databaseComment, ifNotExists, dbprops, name\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: name, dbprops, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -7249,15 +7249,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_restrictOrCascade.add(restrictOrCascade178.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: ifExists, identifier, restrictOrCascade\n+\t\t\t// elements: restrictOrCascade, ifExists, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -7518,15 +7518,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_force.add(force187.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tablePartitionPrefix, columnNameList, force\n+\t\t\t// elements: columnNameList, tablePartitionPrefix, 
force\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -7688,15 +7688,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replicationClause.add(replicationClause193.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableName, ifExists, replicationClause, KW_PURGE\n+\t\t\t// elements: ifExists, KW_PURGE, tableName, replicationClause\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -9356,15 +9356,15 @@\n \t\t\t\t\tdbName=identifier();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(dbName.getTree());\n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: KW_EXTENDED, dbName\n+\t\t\t\t\t// elements: dbName, KW_EXTENDED\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: dbName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -9432,15 +9432,15 @@\n \t\t\t\t\tdcName=identifier();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(dcName.getTree());\n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: KW_EXTENDED, dcName\n+\t\t\t\t\t// elements: dcName, KW_EXTENDED\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: dcName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -9503,15 +9503,15 @@\n 
\t\t\t\t\tname=descFuncNames();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_descFuncNames.add(name.getTree());\n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: name, KW_EXTENDED\n+\t\t\t\t\t// elements: KW_EXTENDED, name\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -9840,15 +9840,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: noscan, parttype, statsColumnName, KW_COLUMNS\n+\t\t\t\t\t// elements: KW_COLUMNS, parttype, statsColumnName, noscan\n \t\t\t\t\t// token labels: noscan\n \t\t\t\t\t// rule labels: statsColumnName, parttype, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -10505,15 +10505,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_showTablesFilterExpr.add(filter.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: isExtended, filter, db_name\n+\t\t\t\t\t// elements: db_name, filter, isExtended\n \t\t\t\t\t// token labels: isExtended\n \t\t\t\t\t// rule labels: filter, db_name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -10806,15 +10806,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_showStmtIdentifier.add(showStmtIdentifier250.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: showStmtIdentifier, db_name\n+\t\t\t\t\t// elements: db_name, 
showStmtIdentifier\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: db_name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11011,15 +11011,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_showStmtIdentifier.add(showStmtIdentifier261.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: db_name, KW_SORTED, tableName, showStmtIdentifier\n+\t\t\t\t\t// elements: showStmtIdentifier, db_name, KW_SORTED, tableName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: db_name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11093,15 +11093,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_showFunctionIdentifier.add(showFunctionIdentifier265.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: KW_LIKE, showFunctionIdentifier\n+\t\t\t\t\t// elements: showFunctionIdentifier, KW_LIKE\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11228,15 +11228,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_limitClause.add(limitClause271.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: whereClause, limitClause, orderByClause, tabName, partitionSpec\n+\t\t\t\t\t// elements: limitClause, orderByClause, partitionSpec, whereClause, tabName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: tabName, retval\n \t\t\t\t\t// token list labels: \n 
\t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11533,15 +11533,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec284.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: db_name, showStmtIdentifier, partitionSpec\n+\t\t\t\t\t// elements: partitionSpec, showStmtIdentifier, db_name\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: db_name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11614,15 +11614,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: tableName, prptyName\n+\t\t\t\t\t// elements: prptyName, tableName\n \t\t\t\t\t// token labels: prptyName\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -12027,15 +12027,15 @@\n \n \t\t\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t\t\t}\n \n \t\t\t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t\t\t// elements: dbName, isExtended\n+\t\t\t\t\t\t\t// elements: isExtended, dbName\n \t\t\t\t\t\t\t// token labels: isExtended\n \t\t\t\t\t\t\t// rule labels: dbName, retval\n \t\t\t\t\t\t\t// token list labels: \n \t\t\t\t\t\t\t// rule list labels: \n \t\t\t\t\t\t\t// wildcard labels: \n \t\t\t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\t\t\tretval.tree = root_0;\n@@ -12106,15 +12106,15 @@\n \n \t\t\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t\t\t}\n \n \t\t\t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t\t\t// elements: parttype, isExtended\n+\t\t\t\t\t\t\t// elements: isExtended, parttype\n \t\t\t\t\t\t\t// token labels: 
isExtended\n \t\t\t\t\t\t\t// rule labels: parttype, retval\n \t\t\t\t\t\t\t// token list labels: \n \t\t\t\t\t\t\t// rule list labels: \n \t\t\t\t\t\t\t// wildcard labels: \n \t\t\t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\t\t\tretval.tree = root_0;\n@@ -12355,15 +12355,15 @@\n \t\t\t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_limitClause.add(limitClause303.getTree());\n \t\t\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t\t\t}\n \n \t\t\t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t\t\t// elements: dbName, compactionStatus, limitClause, compactionPool, orderByClause, compactionType\n+\t\t\t\t\t\t\t// elements: dbName, compactionType, compactionStatus, compactionPool, limitClause, orderByClause\n \t\t\t\t\t\t\t// token labels: \n \t\t\t\t\t\t\t// rule labels: dbName, retval\n \t\t\t\t\t\t\t// token list labels: \n \t\t\t\t\t\t\t// rule list labels: \n \t\t\t\t\t\t\t// wildcard labels: \n \t\t\t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\t\t\tretval.tree = root_0;\n@@ -12805,15 +12805,15 @@\n \t\t\t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_limitClause.add(limitClause308.getTree());\n \t\t\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t\t\t}\n \n \t\t\t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t\t\t// elements: compactionStatus, limitClause, compactionType, compactionPool, parttype, orderByClause\n+\t\t\t\t\t\t\t// elements: parttype, compactionType, limitClause, orderByClause, compactionPool, compactionStatus\n \t\t\t\t\t\t\t// token labels: \n \t\t\t\t\t\t\t// rule labels: parttype, retval\n \t\t\t\t\t\t\t// token list labels: \n \t\t\t\t\t\t\t// rule list labels: \n \t\t\t\t\t\t\t// wildcard labels: \n \t\t\t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\t\t\tretval.tree = root_0;\n@@ -13455,15 +13455,15 @@\n \t\t\t\t\tEQUAL322=(Token)match(input,EQUAL,FOLLOW_EQUAL_in_showTablesFilterExpr6030); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_EQUAL.add(EQUAL322);\n \n 
\t\t\t\t\tStringLiteral323=(Token)match(input,StringLiteral,FOLLOW_StringLiteral_in_showTablesFilterExpr6032); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_StringLiteral.add(StringLiteral323);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: identifier, StringLiteral\n+\t\t\t\t\t// elements: StringLiteral, identifier\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -13631,15 +13631,15 @@\n \n \t\t\tpushFollow(FOLLOW_lockMode_in_lockStatement6102);\n \t\t\tlockMode331=lockMode();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_lockMode.add(lockMode331.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableName, lockMode, partitionSpec\n+\t\t\t// elements: lockMode, tableName, partitionSpec\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -14434,15 +14434,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_withGrantOption.add(withGrantOption352.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: privList, privilegeObject, principalSpecification, withGrantOption\n+\t\t\t// elements: withGrantOption, privList, principalSpecification, privilegeObject\n \t\t\t// token labels: \n \t\t\t// rule labels: privList, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -14592,15 +14592,15 @@\n \n \t\t\tpushFollow(FOLLOW_principalSpecification_in_revokePrivileges6503);\n \t\t\tprincipalSpecification358=principalSpecification();\n 
\t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_principalSpecification.add(principalSpecification358.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: grantOptionFor, principalSpecification, privilegeObject, privilegeList\n+\t\t\t// elements: principalSpecification, privilegeObject, grantOptionFor, privilegeList\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -14783,15 +14783,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_withAdminOption.add(withAdminOption366.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, principalSpecification, withAdminOption\n+\t\t\t// elements: principalSpecification, withAdminOption, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -14979,15 +14979,15 @@\n \n \t\t\tpushFollow(FOLLOW_principalSpecification_in_revokeRole6633);\n \t\t\tprincipalSpecification374=principalSpecification();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_principalSpecification.add(principalSpecification374.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, principalSpecification, adminOptionFor\n+\t\t\t// elements: adminOptionFor, identifier, principalSpecification\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -16489,15 +16489,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec403.getTree());\n 
\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: partitionSpec, tableName\n+\t\t\t\t\t// elements: tableName, partitionSpec\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -17160,15 +17160,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec414.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: tableName, cols, partitionSpec\n+\t\t\t\t\t// elements: cols, partitionSpec, tableName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: cols, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -18900,15 +18900,15 @@\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: partitionSelectorSpec, opt, tableName, repair\n+\t\t\t// elements: partitionSelectorSpec, tableName, repair, opt\n \t\t\t// token labels: repair, opt\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -19472,15 +19472,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_resourceList.add(rList.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: StringLiteral, functionIdentifier, functionIdentifier, rList, rList, StringLiteral\n+\t\t\t// elements: rList, StringLiteral, rList, functionIdentifier, functionIdentifier, StringLiteral\n \t\t\t// token labels: \n \t\t\t// rule labels: rList, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard 
labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -19922,15 +19922,15 @@\n \n \t\t\tpushFollow(FOLLOW_expression_in_createMacroStatement8367);\n \t\t\texpression485=expression();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_expression.add(expression485.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: Identifier, expression, columnNameTypeList\n+\t\t\t// elements: columnNameTypeList, Identifier, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -20049,15 +20049,15 @@\n \n \t\t\t}\n \n \t\t\tIdentifier490=(Token)match(input,Identifier,FOLLOW_Identifier_in_dropMacroStatement8420); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_Identifier.add(Identifier490);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: Identifier, ifExists\n+\t\t\t// elements: ifExists, Identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -20307,15 +20307,15 @@\n \n \t\t\tpushFollow(FOLLOW_selectStatementWithCTE_in_createViewStatement8532);\n \t\t\tselectStatementWithCTE502=selectStatementWithCTE();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_selectStatementWithCTE.add(selectStatementWithCTE502.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: ifNotExists, tableComment, viewPartition, columnNameCommentList, tablePropertiesPrefixed, selectStatementWithCTE, name, orReplace\n+\t\t\t// elements: columnNameCommentList, viewPartition, tablePropertiesPrefixed, orReplace, selectStatementWithCTE, tableComment, name, ifNotExists\n \t\t\t// token labels: \n 
\t\t\t// rule labels: name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -21164,15 +21164,15 @@\n \n \t\t\tpushFollow(FOLLOW_viewName_in_dropViewStatement8957);\n \t\t\tviewName533=viewName();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_viewName.add(viewName533.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: ifExists, viewName\n+\t\t\t// elements: viewName, ifExists\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -21482,15 +21482,15 @@\n \n \t\t\tpushFollow(FOLLOW_selectStatementWithCTE_in_createMaterializedViewStatement9060);\n \t\t\tselectStatementWithCTE547=selectStatementWithCTE();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_selectStatementWithCTE.add(selectStatementWithCTE547.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableRowFormat, ifNotExists, selectStatementWithCTE, name, rewriteDisabled, tableComment, tableLocation, viewPartition, tableFileFormat, tablePropertiesPrefixed, viewOrganization\n+\t\t\t// elements: tableComment, tableLocation, viewOrganization, tablePropertiesPrefixed, viewPartition, selectStatementWithCTE, name, tableRowFormat, ifNotExists, rewriteDisabled, tableFileFormat\n \t\t\t// token labels: \n \t\t\t// rule labels: name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -21822,15 +21822,15 @@\n \n \t\t\tpushFollow(FOLLOW_definedAsSpec_in_createScheduledQueryStatement9325);\n \t\t\tdefinedAsSpec559=definedAsSpec();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n 
\t\t\tif ( state.backtracking==0 ) stream_definedAsSpec.add(definedAsSpec559.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: enableSpecification, scheduleSpec, name, executedAsSpec, definedAsSpec\n+\t\t\t// elements: enableSpecification, name, scheduleSpec, executedAsSpec, definedAsSpec\n \t\t\t// token labels: \n \t\t\t// rule labels: name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -22043,15 +22043,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_identifier.add(name.getTree());\n \t\t\tpushFollow(FOLLOW_alterScheduledQueryChange_in_alterScheduledQueryStatement9538);\n \t\t\tmod=alterScheduledQueryChange();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_alterScheduledQueryChange.add(mod.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: mod, name\n+\t\t\t// elements: name, mod\n \t\t\t// token labels: \n \t\t\t// rule labels: mod, name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -24490,15 +24490,15 @@\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_columnName.add(columnName625.getTree());\n \t\t\t\t\tRPAREN626=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_partitionTransformType10482); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN626);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: columnName, value\n+\t\t\t\t\t// elements: value, columnName\n \t\t\t\t\t// token labels: value\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -24725,15 +24725,15 @@\n 
\t\t\tnum=(Token)match(input,Number,FOLLOW_Number_in_tableBuckets10620); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_Number.add(num);\n \n \t\t\tKW_BUCKETS641=(Token)match(input,KW_BUCKETS,FOLLOW_KW_BUCKETS_in_tableBuckets10622); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_KW_BUCKETS.add(KW_BUCKETS641);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: sortCols, bucketCols, num\n+\t\t\t// elements: bucketCols, sortCols, num\n \t\t\t// token labels: num\n \t\t\t// rule labels: bucketCols, sortCols, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -24998,15 +24998,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_storedAsDirs.add(storedAsDirs652.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: storedAsDirs, skewedCols, skewedValues\n+\t\t\t// elements: skewedCols, skewedValues, storedAsDirs\n \t\t\t// token labels: \n \t\t\t// rule labels: skewedCols, skewedValues, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -25834,15 +25834,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_tableRowNullFormat.add(tableRowNullFormat671.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableRowFormatCollItemsIdentifier, tableRowFormatLinesIdentifier, tableRowNullFormat, tableRowFormatFieldIdentifier, tableRowFormatMapKeysIdentifier\n+\t\t\t// elements: tableRowFormatMapKeysIdentifier, tableRowFormatLinesIdentifier, tableRowNullFormat, tableRowFormatFieldIdentifier, tableRowFormatCollItemsIdentifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( 
state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -26520,15 +26520,15 @@\n \t\t\tEQUAL685=(Token)match(input,EQUAL,FOLLOW_EQUAL_in_keyValueProperty11351); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_EQUAL.add(EQUAL685);\n \n \t\t\tvalue=(Token)match(input,StringLiteral,FOLLOW_StringLiteral_in_keyValueProperty11355); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_StringLiteral.add(value);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: key, value\n+\t\t\t// elements: value, key\n \t\t\t// token labels: value, key\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -27420,15 +27420,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: inFmt, inDriver, outDriver, outFmt\n+\t\t\t\t\t// elements: outFmt, outDriver, inDriver, inFmt\n \t\t\t\t\t// token labels: inFmt, inDriver, outDriver, outFmt\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -27531,15 +27531,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(fileformat.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: storageHandler, serdeprops, fileformat\n+\t\t\t\t\t// elements: fileformat, storageHandler, serdeprops\n \t\t\t\t\t// token labels: storageHandler\n \t\t\t\t\t// rule labels: serdeprops, fileformat, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -27649,25 +27649,25 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) 
stream_identifier.add(fileformat.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: fileformat, genericSpec, serdeprops\n+\t\t\t\t\t// elements: serdeprops, genericSpec, fileformat\n \t\t\t\t\t// token labels: \n-\t\t\t\t\t// rule labels: serdeprops, fileformat, genericSpec, retval\n+\t\t\t\t\t// rule labels: serdeprops, genericSpec, fileformat, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_serdeprops=new RewriteRuleSubtreeStream(adaptor,\"rule serdeprops\",serdeprops!=null?serdeprops.getTree():null);\n-\t\t\t\t\tRewriteRuleSubtreeStream stream_fileformat=new RewriteRuleSubtreeStream(adaptor,\"rule fileformat\",fileformat!=null?fileformat.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_genericSpec=new RewriteRuleSubtreeStream(adaptor,\"rule genericSpec\",genericSpec!=null?genericSpec.getTree():null);\n+\t\t\t\t\tRewriteRuleSubtreeStream stream_fileformat=new RewriteRuleSubtreeStream(adaptor,\"rule fileformat\",fileformat!=null?fileformat.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n \n \t\t\t\t\troot_0 = (ASTNode)adaptor.nil();\n \t\t\t\t\t// 2083:7: -> ^( TOK_STORAGEHANDLER $genericSpec ( $serdeprops)? ( ^( TOK_FILEFORMAT_GENERIC $fileformat) )? )\n \t\t\t\t\t{\n \t\t\t\t\t\t// org/apache/hadoop/hive/ql/parse/HiveParser.g:2083:10: ^( TOK_STORAGEHANDLER $genericSpec ( $serdeprops)? ( ^( TOK_FILEFORMAT_GENERIC $fileformat) )? 
)\n \t\t\t\t\t\t{\n@@ -29751,15 +29751,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsCreate.add(constraintOptsCreate765.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: constraintName, constraintOptsCreate, constraintOptsCreate\n+\t\t\t// elements: constraintOptsCreate, constraintOptsCreate, constraintName\n \t\t\t// token labels: \n \t\t\t// rule labels: constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -30351,15 +30351,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsCreate.add(constraintOptsCreate780.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tabName, parCols, fkCols, tabName, parCols, constraintOptsCreate, fkCols, constraintName, constraintOptsCreate\n+\t\t\t// elements: tabName, constraintOptsCreate, constraintName, parCols, constraintOptsCreate, fkCols, tabName, parCols, fkCols\n \t\t\t// token labels: \n \t\t\t// rule labels: tabName, parCols, fkCols, constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -30538,15 +30538,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsAlter.add(constraintOptsAlter785.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tabName, constraintOptsAlter, parCols, constraintName, fkCols\n+\t\t\t// elements: tabName, parCols, constraintName, fkCols, constraintOptsAlter\n \t\t\t// token labels: \n \t\t\t// rule labels: tabName, parCols, fkCols, constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -31485,15 
+31485,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_nullOrdering.add(nullSpec.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, nullSpec, nullSpec, identifier, identifier, nullSpec, identifier, identifier, identifier\n+\t\t\t// elements: nullSpec, nullSpec, identifier, identifier, identifier, nullSpec, identifier, identifier, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: nullSpec, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -31817,15 +31817,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: colName, comment\n+\t\t\t// elements: comment, colName\n \t\t\t// token labels: comment\n \t\t\t// rule labels: colName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -32088,15 +32088,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_nullOrdering.add(nullSpec.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: expression, orderSpec, orderSpec, expression, nullSpec, expression, expression, orderSpec, expression, orderSpec, expression, expression, orderSpec, nullSpec, expression\n+\t\t\t// elements: nullSpec, orderSpec, expression, expression, expression, expression, expression, nullSpec, expression, orderSpec, orderSpec, expression, orderSpec, orderSpec, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: orderSpec, nullSpec, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -32345,15 +32345,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: colName, comment, 
colType, colType, colName\n+\t\t\t// elements: colType, colName, colType, comment, colName\n \t\t\t// token labels: comment\n \t\t\t// rule labels: colName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -33043,15 +33043,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: colName, comment, colType, columnConstraint\n+\t\t\t// elements: comment, colType, colName, columnConstraint\n \t\t\t// token labels: comment\n \t\t\t// rule labels: colName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -33395,15 +33395,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsCreate.add(constraintOptsCreate827.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: constraintOptsCreate, constraintName, colName, colName, tabName, tabName, constraintOptsCreate\n+\t\t\t// elements: constraintName, constraintOptsCreate, colName, constraintOptsCreate, tabName, colName, tabName\n \t\t\t// token labels: \n \t\t\t// rule labels: colName, tabName, constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -33593,15 +33593,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsCreate.add(constraintOptsCreate830.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: constraintOptsCreate, constraintName, constraintOptsCreate\n+\t\t\t// elements: constraintName, constraintOptsCreate, constraintOptsCreate\n \t\t\t// token labels: \n \t\t\t// rule labels: constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard 
labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -33956,24 +33956,24 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsAlter.add(constraintOptsAlter837.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: constraintOptsAlter, colName, constraintOptsAlter, tabName, constraintName, tabName, colName\n+\t\t\t// elements: constraintName, tabName, colName, constraintOptsAlter, colName, tabName, constraintOptsAlter\n \t\t\t// token labels: \n-\t\t\t// rule labels: colName, tabName, constraintName, retval\n+\t\t\t// rule labels: tabName, colName, constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n-\t\t\tRewriteRuleSubtreeStream stream_colName=new RewriteRuleSubtreeStream(adaptor,\"rule colName\",colName!=null?colName.getTree():null);\n \t\t\tRewriteRuleSubtreeStream stream_tabName=new RewriteRuleSubtreeStream(adaptor,\"rule tabName\",tabName!=null?tabName.getTree():null);\n+\t\t\tRewriteRuleSubtreeStream stream_colName=new RewriteRuleSubtreeStream(adaptor,\"rule colName\",colName!=null?colName.getTree():null);\n \t\t\tRewriteRuleSubtreeStream stream_constraintName=new RewriteRuleSubtreeStream(adaptor,\"rule constraintName\",constraintName!=null?constraintName.getTree():null);\n \t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n \n \t\t\troot_0 = (ASTNode)adaptor.nil();\n \t\t\t// 2419:5: -> {$constraintName.tree != null}? ^( TOK_FOREIGN_KEY ^( TOK_CONSTRAINT_NAME $constraintName) ^( TOK_TABCOLNAME ) $tabName ^( TOK_TABCOLNAME $colName) ( constraintOptsAlter )? 
)\n \t\t\tif ((constraintName!=null?((ASTNode)constraintName.getTree()):null) != null) {\n \t\t\t\t// org/apache/hadoop/hive/ql/parse/HiveParser.g:2420:13: ^( TOK_FOREIGN_KEY ^( TOK_CONSTRAINT_NAME $constraintName) ^( TOK_TABCOLNAME ) $tabName ^( TOK_TABCOLNAME $colName) ( constraintOptsAlter )? )\n@@ -34154,15 +34154,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_constraintOptsAlter.add(constraintOptsAlter840.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: constraintOptsAlter, constraintName, constraintOptsAlter\n+\t\t\t// elements: constraintName, constraintOptsAlter, constraintOptsAlter\n \t\t\t// token labels: \n \t\t\t// rule labels: constraintName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -35240,15 +35240,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: colName, colType, colType, comment, colName\n+\t\t\t// elements: colType, comment, colType, colName, colName\n \t\t\t// token labels: comment\n \t\t\t// rule labels: colName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -36812,15 +36812,15 @@\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_type.add(right.getTree());\n \t\t\tGREATERTHAN910=(Token)match(input,GREATERTHAN,FOLLOW_GREATERTHAN_in_mapType15881); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_GREATERTHAN.add(GREATERTHAN910);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: right, left\n+\t\t\t// elements: left, right\n \t\t\t// token labels: \n \t\t\t// rule labels: left, right, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( 
state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -37919,15 +37919,15 @@\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_queryStatementExpression.add(queryStatementExpression943.getTree());\n \t\t\tRPAREN944=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_cteStatement16236); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN944);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, colAliases, queryStatementExpression\n+\t\t\t// elements: colAliases, queryStatementExpression, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: colAliases, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -38268,15 +38268,15 @@\n \t\t\t\t\tEarlyExitException eee = new EarlyExitException(286, input);\n \t\t\t\t\tthrow eee;\n \t\t\t\t}\n \t\t\t\tcnt286++;\n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: body, fromClause\n+\t\t\t// elements: fromClause, body\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -38640,27 +38640,27 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_qualifyClause.add(q.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: g, h, q, win, f, w, s\n+\t\t\t\t\t// elements: f, h, w, q, g, win, s\n \t\t\t\t\t// token labels: \n-\t\t\t\t\t// rule labels: q, s, f, g, w, h, win, retval\n+\t\t\t\t\t// rule labels: q, s, f, w, g, h, win, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_q=new 
RewriteRuleSubtreeStream(adaptor,\"rule q\",q!=null?q.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_s=new RewriteRuleSubtreeStream(adaptor,\"rule s\",s!=null?s.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_f=new RewriteRuleSubtreeStream(adaptor,\"rule f\",f!=null?f.getTree():null);\n-\t\t\t\t\tRewriteRuleSubtreeStream stream_g=new RewriteRuleSubtreeStream(adaptor,\"rule g\",g!=null?g.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_w=new RewriteRuleSubtreeStream(adaptor,\"rule w\",w!=null?w.getTree():null);\n+\t\t\t\t\tRewriteRuleSubtreeStream stream_g=new RewriteRuleSubtreeStream(adaptor,\"rule g\",g!=null?g.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_h=new RewriteRuleSubtreeStream(adaptor,\"rule h\",h!=null?h.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_win=new RewriteRuleSubtreeStream(adaptor,\"rule win\",win!=null?win.getTree():null);\n \t\t\t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n \n \t\t\t\t\troot_0 = (ASTNode)adaptor.nil();\n \t\t\t\t\t// 2637:4: -> ^( TOK_QUERY ( $f)? ^( TOK_INSERT ^( TOK_DESTINATION ^( TOK_DIR TOK_TMP_FILE ) ) $s ( $w)? ( $g)? ( $h)? ( $win)? ( $q)? 
) )\n \t\t\t\t\t{\n@@ -38957,15 +38957,15 @@\n \t\t\t   (a!=null?((ASTNode)a.getTree()):null).getFirstChildWithType(TOK_INSERT).addChild((c!=null?((ASTNode)c.getTree()):null));\n \t\t\t   (a!=null?((ASTNode)a.getTree()):null).getFirstChildWithType(TOK_INSERT).addChild((d!=null?((ASTNode)d.getTree()):null));\n \t\t\t   (a!=null?((ASTNode)a.getTree()):null).getFirstChildWithType(TOK_INSERT).addChild((sort!=null?((ASTNode)sort.getTree()):null));\n \t\t\t   (a!=null?((ASTNode)a.getTree()):null).getFirstChildWithType(TOK_INSERT).addChild((l!=null?((ASTNode)l.getTree()):null));\n \t\t\t   }\n \t\t\t   }\n \t\t\t// AST REWRITE\n-\t\t\t// elements: d, l, c, o, sort\n+\t\t\t// elements: sort, o, c, l, d\n \t\t\t// token labels: \n \t\t\t// rule labels: c, d, sort, l, retval, o\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -39154,15 +39154,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_setOperator.add(u.getTree());\n \t\t\t\t\tpushFollow(FOLLOW_atomSelectStatement_in_setOpSelectStatement17074);\n \t\t\t\t\tb=atomSelectStatement();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_atomSelectStatement.add(b.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: b, b, u, u, b, b\n+\t\t\t\t\t// elements: b, b, u, b, u, b\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: b, u, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -39872,15 +39872,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_limitClause.add(limitClause965.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: orderByClause, selectClause, qualifyClause, window_clause, distributeByClause, 
groupByClause, lateralView, clusterByClause, insertClause, sortByClause, limitClause, havingClause, whereClause\n+\t\t\t\t\t// elements: distributeByClause, insertClause, orderByClause, selectClause, clusterByClause, limitClause, qualifyClause, groupByClause, whereClause, lateralView, havingClause, sortByClause, window_clause\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -40197,15 +40197,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_limitClause.add(limitClause977.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: whereClause, havingClause, selectClause, window_clause, limitClause, orderByClause, clusterByClause, distributeByClause, lateralView, groupByClause, sortByClause, qualifyClause\n+\t\t\t\t\t// elements: whereClause, sortByClause, havingClause, orderByClause, groupByClause, qualifyClause, selectClause, distributeByClause, lateralView, window_clause, limitClause, clusterByClause\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -40447,15 +40447,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_ifNotExists.add(ifNotExists981.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: destination, ifNotExists\n+\t\t\t\t\t// elements: ifNotExists, destination\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -40727,15 
+40727,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_tableFileFormat.add(tableFileFormat991.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: StringLiteral, local, tableRowFormat, tableFileFormat\n+\t\t\t\t\t// elements: local, tableRowFormat, tableFileFormat, StringLiteral\n \t\t\t\t\t// token labels: local\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -40961,15 +40961,15 @@\n \n \t\t\t\t\tnum=(Token)match(input,Number,FOLLOW_Number_in_limitClause18291); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_Number.add(num);\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: offset, num\n+\t\t\t\t\t// elements: num, offset\n \t\t\t\t\t// token labels: offset, num\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -41014,15 +41014,15 @@\n \t\t\t\t\tKW_OFFSET997=(Token)match(input,KW_OFFSET,FOLLOW_KW_OFFSET_in_limitClause18320); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_KW_OFFSET.add(KW_OFFSET997);\n \n \t\t\t\t\toffset=(Token)match(input,Number,FOLLOW_Number_in_limitClause18324); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_Number.add(offset);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: num, offset\n+\t\t\t\t\t// elements: offset, num\n \t\t\t\t\t// token labels: offset, num\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -41577,15 +41577,15 @@\n 
\t\t\t\t\tif ( state.backtracking==0 ) stream_whereClause.add(whereClause1014.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: whereClause, tableName, setColumnsClause\n+\t\t\t// elements: setColumnsClause, whereClause, tableName\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -43003,15 +43003,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_expression.add(expression1056.getTree());\n \t\t\tpushFollow(FOLLOW_whenClauses_in_mergeStatement18976);\n \t\t\twhenClauses1057=whenClauses();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_whenClauses.add(whenClauses1057.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: QUERY_HINT, identifier, expression, whenClauses, tableName, joinSourcePart\n+\t\t\t// elements: whenClauses, tableName, QUERY_HINT, joinSourcePart, identifier, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -43312,15 +43312,15 @@\n \n \t\t\tpushFollow(FOLLOW_valueRowConstructor_in_whenNotMatchedClause19084);\n \t\t\tvalueRowConstructor1069=valueRowConstructor();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_valueRowConstructor.add(valueRowConstructor1069.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: expression, targetCols, valueRowConstructor\n+\t\t\t// elements: expression, valueRowConstructor, targetCols\n \t\t\t// token labels: \n \t\t\t// rule labels: targetCols, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree 
= root_0;\n@@ -44554,545 +44554,545 @@\n \t\t}\n \t\t}\n \n \t}\n \t// $ANTLR end synpred24_HiveParser\n \n \t// Delegated rules\n-\tpublic HiveParser_IdentifiersParser.precedenceStarOperator_return precedenceStarOperator() throws RecognitionException { return gIdentifiersParser.precedenceStarOperator(); }\n+\tpublic HiveParser_SelectClauseParser.window_frame_return window_frame() throws RecognitionException { return gSelectClauseParser.window_frame(); }\n \n-\tpublic HiveParser_ResourcePlanParser.alterPoolStatement_return alterPoolStatement() throws RecognitionException { return gResourcePlanParser.alterPoolStatement(); }\n+\tpublic HiveParser_ResourcePlanParser.createPoolStatement_return createPoolStatement() throws RecognitionException { return gResourcePlanParser.createPoolStatement(); }\n \n-\tpublic HiveParser_ResourcePlanParser.createMappingStatement_return createMappingStatement() throws RecognitionException { return gResourcePlanParser.createMappingStatement(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSerdeProperties_return alterStatementSuffixSerdeProperties(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSerdeProperties(partition); }\n \n-\tpublic HiveParser_IdentifiersParser.subQuerySelectorOperator_return subQuerySelectorOperator() throws RecognitionException { return gIdentifiersParser.subQuerySelectorOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixProperties_return alterDatabaseSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixProperties(); }\n \n-\tpublic HiveParser_IdentifiersParser.dateLiteral_return dateLiteral() throws RecognitionException { return gIdentifiersParser.dateLiteral(); }\n+\tpublic HiveParser_AlterClauseParser.skewedLocations_return skewedLocations() throws RecognitionException { return gAlterClauseParser.skewedLocations(); }\n \n-\tpublic HiveParser_ResourcePlanParser.triggerOrExpression_return 
triggerOrExpression() throws RecognitionException { return gResourcePlanParser.triggerOrExpression(); }\n+\tpublic HiveParser_IdentifiersParser.partitionSelectorVal_return partitionSelectorVal() throws RecognitionException { return gIdentifiersParser.partitionSelectorVal(); }\n \n-\tpublic HiveParser_IdentifiersParser.isCondition_return isCondition() throws RecognitionException { return gIdentifiersParser.isCondition(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceBitwiseXorOperator_return precedenceBitwiseXorOperator() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseXorOperator(); }\n \n-\tpublic HiveParser_FromClauseParser.tableSource_return tableSource() throws RecognitionException { return gFromClauseParser.tableSource(); }\n+\tpublic HiveParser_ResourcePlanParser.alterResourcePlanStatement_return alterResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.alterResourcePlanStatement(); }\n \n-\tpublic HiveParser_IdentifiersParser.functionIdentifier_return functionIdentifier() throws RecognitionException { return gIdentifiersParser.functionIdentifier(); }\n+\tpublic HiveParser_AlterClauseParser.refRetain_return refRetain() throws RecognitionException { return gAlterClauseParser.refRetain(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceEqualExpression_return precedenceEqualExpression() throws RecognitionException { return gIdentifiersParser.precedenceEqualExpression(); }\n+\tpublic HiveParser_SelectClauseParser.window_range_expression_return window_range_expression() throws RecognitionException { return gSelectClauseParser.window_range_expression(); }\n \n-\tpublic HiveParser_IdentifiersParser.partitionVal_return partitionVal() throws RecognitionException { return gIdentifiersParser.partitionVal(); }\n+\tpublic HiveParser_IdentifiersParser.prepareStmtParam_return prepareStmtParam() throws RecognitionException { return gIdentifiersParser.prepareStmtParam(); }\n \n-\tpublic 
HiveParser_AlterClauseParser.alterStatementSuffixSetOwner_return alterStatementSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSetOwner(); }
+	public HiveParser_IdentifiersParser.firstExpressionsWithAlias_return firstExpressionsWithAlias() throws RecognitionException { return gIdentifiersParser.firstExpressionsWithAlias(); }
 
-	public HiveParser_IdentifiersParser.groupingSetExpressionMultiple_return groupingSetExpressionMultiple() throws RecognitionException { return gIdentifiersParser.groupingSetExpressionMultiple(); }
+	public HiveParser_ResourcePlanParser.comparisionOperator_return comparisionOperator() throws RecognitionException { return gResourcePlanParser.comparisionOperator(); }
 
-	public HiveParser_IdentifiersParser.precedenceBitwiseXorExpression_return precedenceBitwiseXorExpression() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseXorExpression(); }
+	public HiveParser_IdentifiersParser.precedenceUnaryOperator_return precedenceUnaryOperator() throws RecognitionException { return gIdentifiersParser.precedenceUnaryOperator(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixCompact_return alterStatementSuffixCompact() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCompact(); }
+	public HiveParser_PrepareStatementParser.prepareStatement_return prepareStatement() throws RecognitionException { return gPrepareStatementParser.prepareStatement(); }
 
-	public HiveParser_SelectClauseParser.selectExpression_return selectExpression() throws RecognitionException { return gSelectClauseParser.selectExpression(); }
+	public HiveParser_IdentifiersParser.precedenceConcatenateExpression_return precedenceConcatenateExpression() throws RecognitionException { return gIdentifiersParser.precedenceConcatenateExpression(); }
 
-	public HiveParser_FromClauseParser.uniqueJoinExpr_return uniqueJoinExpr() throws RecognitionException { return gFromClauseParser.uniqueJoinExpr(); }
+	public HiveParser_IdentifiersParser.quantifierType_return quantifierType() throws RecognitionException { return gIdentifiersParser.quantifierType(); }
 
-	public HiveParser_ResourcePlanParser.alterMappingStatement_return alterMappingStatement() throws RecognitionException { return gResourcePlanParser.alterMappingStatement(); }
+	public HiveParser_ResourcePlanParser.alterPoolStatement_return alterPoolStatement() throws RecognitionException { return gResourcePlanParser.alterPoolStatement(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixExecute_return alterStatementSuffixExecute() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixExecute(); }
+	public HiveParser_IdentifiersParser.precedenceSimilarExpressionIn_return precedenceSimilarExpressionIn(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionIn(t); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixUpdateStatsCol_return alterStatementSuffixUpdateStatsCol(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateStatsCol(partition); }
+	public HiveParser_AlterClauseParser.alterDataConnectorStatementSuffix_return alterDataConnectorStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterDataConnectorStatementSuffix(); }
 
-	public HiveParser_IdentifiersParser.expressionPart_return expressionPart(CommonTree firstExprTree, boolean isStruct) throws RecognitionException { return gIdentifiersParser.expressionPart(firstExprTree, isStruct); }
+	public HiveParser_AlterClauseParser.tablePartitionPrefix_return tablePartitionPrefix() throws RecognitionException { return gAlterClauseParser.tablePartitionPrefix(); }
 
-	public HiveParser_AlterClauseParser.alterTblPartitionStatementSuffixSkewedLocation_return alterTblPartitionStatementSuffixSkewedLocation() throws RecognitionException { return gAlterClauseParser.alterTblPartitionStatementSuffixSkewedLocation(); }
+	public HiveParser_AlterClauseParser.alterDatabaseSuffixSetManagedLocation_return alterDatabaseSuffixSetManagedLocation() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetManagedLocation(); }
 
-	public HiveParser_IdentifiersParser.precedenceUnarySuffixExpression_return precedenceUnarySuffixExpression() throws RecognitionException { return gIdentifiersParser.precedenceUnarySuffixExpression(); }
+	public HiveParser_AlterClauseParser.skewedLocationsList_return skewedLocationsList() throws RecognitionException { return gAlterClauseParser.skewedLocationsList(); }
 
-	public HiveParser_IdentifiersParser.identifier_return identifier() throws RecognitionException { return gIdentifiersParser.identifier(); }
+	public HiveParser_SelectClauseParser.window_value_expression_return window_value_expression() throws RecognitionException { return gSelectClauseParser.window_value_expression(); }
 
-	public HiveParser_FromClauseParser.fromClause_return fromClause() throws RecognitionException { return gFromClauseParser.fromClause(); }
+	public HiveParser_SelectClauseParser.selectExpressionList_return selectExpressionList() throws RecognitionException { return gSelectClauseParser.selectExpressionList(); }
 
-	public HiveParser_CreateDDLParser.likeTableOrFile_return likeTableOrFile() throws RecognitionException { return gCreateDDLParser.likeTableOrFile(); }
+	public HiveParser_IdentifiersParser.caseExpression_return caseExpression() throws RecognitionException { return gIdentifiersParser.caseExpression(); }
 
-	public HiveParser_FromClauseParser.viewName_return viewName() throws RecognitionException { return gFromClauseParser.viewName(); }
+	public HiveParser_FromClauseParser.expressionList_return expressionList() throws RecognitionException { return gFromClauseParser.expressionList(); }
 
-	public HiveParser_FromClauseParser.tableAllColumns_return tableAllColumns() throws RecognitionException { return gFromClauseParser.tableAllColumns(); }
+	public HiveParser_IdentifiersParser.parameterIdx_return parameterIdx() throws RecognitionException { return gIdentifiersParser.parameterIdx(); }
 
-	public HiveParser_IdentifiersParser.intervalExpression_return intervalExpression() throws RecognitionException { return gIdentifiersParser.intervalExpression(); }
+	public HiveParser_CreateDDLParser.dcProperties_return dcProperties() throws RecognitionException { return gCreateDDLParser.dcProperties(); }
 
-	public HiveParser_IdentifiersParser.havingCondition_return havingCondition() throws RecognitionException { return gIdentifiersParser.havingCondition(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixAddPartitionsElement_return alterStatementSuffixAddPartitionsElement() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddPartitionsElement(); }
 
-	public HiveParser_ResourcePlanParser.dropResourcePlanStatement_return dropResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.dropResourcePlanStatement(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixLocation_return alterStatementSuffixLocation(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixLocation(partition); }
 
-	public HiveParser_CreateDDLParser.dcProperties_return dcProperties() throws RecognitionException { return gCreateDDLParser.dcProperties(); }
+	public HiveParser_SelectClauseParser.trfmClause_return trfmClause() throws RecognitionException { return gSelectClauseParser.trfmClause(); }
 
-	public HiveParser_CreateDDLParser.dataConnectorUrl_return dataConnectorUrl() throws RecognitionException { return gCreateDDLParser.dataConnectorUrl(); }
+	public HiveParser_IdentifiersParser.charSetStringLiteral_return charSetStringLiteral() throws RecognitionException { return gIdentifiersParser.charSetStringLiteral(); }
 
-	public HiveParser_ResourcePlanParser.globalWmStatement_return globalWmStatement() throws RecognitionException { return gResourcePlanParser.globalWmStatement(); }
+	public HiveParser_ResourcePlanParser.unmanaged_return unmanaged() throws RecognitionException { return gResourcePlanParser.unmanaged(); }
 
-	public HiveParser_IdentifiersParser.trimFunction_return trimFunction() throws RecognitionException { return gIdentifiersParser.trimFunction(); }
+	public HiveParser_SelectClauseParser.window_defn_return window_defn() throws RecognitionException { return gSelectClauseParser.window_defn(); }
 
-	public HiveParser_ResourcePlanParser.triggerAndExpression_return triggerAndExpression() throws RecognitionException { return gResourcePlanParser.triggerAndExpression(); }
+	public HiveParser_IdentifiersParser.principalIdentifier_return principalIdentifier() throws RecognitionException { return gIdentifiersParser.principalIdentifier(); }
 
-	public HiveParser_IdentifiersParser.precedenceAmpersandExpression_return precedenceAmpersandExpression() throws RecognitionException { return gIdentifiersParser.precedenceAmpersandExpression(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixRenamePart_return alterStatementSuffixRenamePart() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRenamePart(); }
 
-	public HiveParser_IdentifiersParser.precedenceUnaryOperator_return precedenceUnaryOperator() throws RecognitionException { return gIdentifiersParser.precedenceUnaryOperator(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixAddConstraint_return alterStatementSuffixAddConstraint() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddConstraint(); }
 
-	public HiveParser_ResourcePlanParser.triggerActionExpression_return triggerActionExpression() throws RecognitionException { return gResourcePlanParser.triggerActionExpression(); }
+	public HiveParser_IdentifiersParser.dateLiteral_return dateLiteral() throws RecognitionException { return gIdentifiersParser.dateLiteral(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixUpdateStats_return alterStatementSuffixUpdateStats(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateStats(partition); }
+	public HiveParser_SelectClauseParser.window_specification_return window_specification(CommonTree nullTreatment) throws RecognitionException { return gSelectClauseParser.window_specification(nullTreatment); }
 
-	public HiveParser_FromClauseParser.joinSource_return joinSource() throws RecognitionException { return gFromClauseParser.joinSource(); }
+	public HiveParser_IdentifiersParser.precedenceSimilarExpression_return precedenceSimilarExpression() throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpression(); }
 
-	public HiveParser_ResourcePlanParser.triggerExpressionStandalone_return triggerExpressionStandalone() throws RecognitionException { return gResourcePlanParser.triggerExpressionStandalone(); }
+	public HiveParser_IdentifiersParser.precedenceNotOperator_return precedenceNotOperator() throws RecognitionException { return gIdentifiersParser.precedenceNotOperator(); }
 
-	public HiveParser_SelectClauseParser.selectClause_return selectClause() throws RecognitionException { return gSelectClauseParser.selectClause(); }
+	public HiveParser_FromClauseParser.virtualTableSource_return virtualTableSource() throws RecognitionException { return gFromClauseParser.virtualTableSource(); }
 
-	public HiveParser_IdentifiersParser.charSetStringLiteral_return charSetStringLiteral() throws RecognitionException { return gIdentifiersParser.charSetStringLiteral(); }
+	public HiveParser_IdentifiersParser.tableOrPartition_return tableOrPartition() throws RecognitionException { return gIdentifiersParser.tableOrPartition(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixUnArchive_return alterStatementSuffixUnArchive() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUnArchive(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixCreateBranch_return alterStatementSuffixCreateBranch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCreateBranch(); }
 
-	public HiveParser_IdentifiersParser.precedenceAndExpression_return precedenceAndExpression() throws RecognitionException { return gIdentifiersParser.precedenceAndExpression(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixArchive_return alterStatementSuffixArchive() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixArchive(); }
 
-	public HiveParser_FromClauseParser.valuesTableConstructor_return valuesTableConstructor() throws RecognitionException { return gFromClauseParser.valuesTableConstructor(); }
+	public HiveParser_ResourcePlanParser.triggerLiteral_return triggerLiteral() throws RecognitionException { return gResourcePlanParser.triggerLiteral(); }
 
-	public HiveParser_IdentifiersParser.precedencePlusOperator_return precedencePlusOperator() throws RecognitionException { return gIdentifiersParser.precedencePlusOperator(); }
+	public HiveParser_ResourcePlanParser.rpUnassignList_return rpUnassignList() throws RecognitionException { return gResourcePlanParser.rpUnassignList(); }
 
-	public HiveParser_IdentifiersParser.groupingExpressionSingle_return groupingExpressionSingle() throws RecognitionException { return gIdentifiersParser.groupingExpressionSingle(); }
+	public HiveParser_ResourcePlanParser.createMappingStatement_return createMappingStatement() throws RecognitionException { return gResourcePlanParser.createMappingStatement(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixDropPartitions_return alterStatementSuffixDropPartitions(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropPartitions(table); }
+	public HiveParser_ResourcePlanParser.dropTriggerStatement_return dropTriggerStatement() throws RecognitionException { return gResourcePlanParser.dropTriggerStatement(); }
 
-	public HiveParser_AlterClauseParser.alterMaterializedViewStatementSuffix_return alterMaterializedViewStatementSuffix(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewStatementSuffix(tableNameTree); }
+	public HiveParser_SelectClauseParser.selectTrfmClause_return selectTrfmClause() throws RecognitionException { return gSelectClauseParser.selectTrfmClause(); }
 
-	public HiveParser_AlterClauseParser.alterStatementPartitionKeyType_return alterStatementPartitionKeyType() throws RecognitionException { return gAlterClauseParser.alterStatementPartitionKeyType(); }
+	public HiveParser_FromClauseParser.partitionTableFunctionSource_return partitionTableFunctionSource() throws RecognitionException { return gFromClauseParser.partitionTableFunctionSource(); }
 
-	public HiveParser_FromClauseParser.tableSample_return tableSample() throws RecognitionException { return gFromClauseParser.tableSample(); }
+	public HiveParser_ResourcePlanParser.triggerExpression_return triggerExpression() throws RecognitionException { return gResourcePlanParser.triggerExpression(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixConvert_return alterStatementSuffixConvert() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixConvert(); }
+	public HiveParser_FromClauseParser.firstValueRowConstructor_return firstValueRowConstructor() throws RecognitionException { return gFromClauseParser.firstValueRowConstructor(); }
 
-	public HiveParser_FromClauseParser.partitionedTableFunction_return partitionedTableFunction() throws RecognitionException { return gFromClauseParser.partitionedTableFunction(); }
+	public HiveParser_AlterClauseParser.alterViewStatementSuffix_return alterViewStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterViewStatementSuffix(); }
 
-	public HiveParser_ResourcePlanParser.activate_return activate() throws RecognitionException { return gResourcePlanParser.activate(); }
+	public HiveParser_IdentifiersParser.precedenceOrExpression_return precedenceOrExpression() throws RecognitionException { return gIdentifiersParser.precedenceOrExpression(); }
 
-	public HiveParser_AlterClauseParser.alterDataConnectorSuffixSetUrl_return alterDataConnectorSuffixSetUrl() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixSetUrl(); }
+	public HiveParser_ResourcePlanParser.triggerActionExpression_return triggerActionExpression() throws RecognitionException { return gResourcePlanParser.triggerActionExpression(); }
 
-	public HiveParser_SelectClauseParser.window_defn_return window_defn() throws RecognitionException { return gSelectClauseParser.window_defn(); }
+	public HiveParser_PrepareStatementParser.executeStatement_return executeStatement() throws RecognitionException { return gPrepareStatementParser.executeStatement(); }
 
-	public HiveParser_PrepareStatementParser.prepareStatement_return prepareStatement() throws RecognitionException { return gPrepareStatementParser.prepareStatement(); }
+	public HiveParser_IdentifiersParser.groupByEmpty_return groupByEmpty() throws RecognitionException { return gIdentifiersParser.groupByEmpty(); }
 
-	public HiveParser_SelectClauseParser.window_frame_return window_frame() throws RecognitionException { return gSelectClauseParser.window_frame(); }
+	public HiveParser_IdentifiersParser.function_return function() throws RecognitionException { return gIdentifiersParser.function(); }
 
-	public HiveParser_IdentifiersParser.precedenceRegexpOperator_return precedenceRegexpOperator() throws RecognitionException { return gIdentifiersParser.precedenceRegexpOperator(); }
+	public HiveParser_IdentifiersParser.precedencePlusOperator_return precedencePlusOperator() throws RecognitionException { return gIdentifiersParser.precedencePlusOperator(); }
 
-	public HiveParser_FromClauseParser.asOfClause_return asOfClause() throws RecognitionException { return gFromClauseParser.asOfClause(); }
+	public HiveParser_FromClauseParser.valuesSource_return valuesSource() throws RecognitionException { return gFromClauseParser.valuesSource(); }
 
-	public HiveParser_IdentifiersParser.groupingSetExpression_return groupingSetExpression() throws RecognitionException { return gIdentifiersParser.groupingSetExpression(); }
+	public HiveParser_AlterClauseParser.alterDataConnectorSuffixProperties_return alterDataConnectorSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixProperties(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixRenamePart_return alterStatementSuffixRenamePart() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRenamePart(); }
+	public HiveParser_IdentifiersParser.booleanValueTok_return booleanValueTok() throws RecognitionException { return gIdentifiersParser.booleanValueTok(); }
 
-	public HiveParser_SelectClauseParser.selectList_return selectList() throws RecognitionException { return gSelectClauseParser.selectList(); }
+	public HiveParser_IdentifiersParser.expressionPart_return expressionPart(CommonTree firstExprTree, boolean isStruct) throws RecognitionException { return gIdentifiersParser.expressionPart(firstExprTree, isStruct); }
 
-	public HiveParser_ResourcePlanParser.dropMappingStatement_return dropMappingStatement() throws RecognitionException { return gResourcePlanParser.dropMappingStatement(); }
+	public HiveParser_IdentifiersParser.booleanValue_return booleanValue() throws RecognitionException { return gIdentifiersParser.booleanValue(); }
 
-	public HiveParser_IdentifiersParser.tableOrPartition_return tableOrPartition() throws RecognitionException { return gIdentifiersParser.tableOrPartition(); }
+	public HiveParser_CreateDDLParser.likeTableOrFile_return likeTableOrFile() throws RecognitionException { return gCreateDDLParser.likeTableOrFile(); }
 
-	public HiveParser_IdentifiersParser.descFuncNames_return descFuncNames() throws RecognitionException { return gIdentifiersParser.descFuncNames(); }
+	public HiveParser_IdentifiersParser.orderByClause_return orderByClause() throws RecognitionException { return gIdentifiersParser.orderByClause(); }
 
-	public HiveParser_ResourcePlanParser.comparisionOperator_return comparisionOperator() throws RecognitionException { return gResourcePlanParser.comparisionOperator(); }
+	public HiveParser_IdentifiersParser.precedenceFieldExpression_return precedenceFieldExpression() throws RecognitionException { return gIdentifiersParser.precedenceFieldExpression(); }
 
-	public HiveParser_IdentifiersParser.precedenceNotExpression_return precedenceNotExpression() throws RecognitionException { return gIdentifiersParser.precedenceNotExpression(); }
+	public HiveParser_IdentifiersParser.expressionOrDefault_return expressionOrDefault() throws RecognitionException { return gIdentifiersParser.expressionOrDefault(); }
 
-	public HiveParser_IdentifiersParser.function_return function() throws RecognitionException { return gIdentifiersParser.function(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixClusterbySortby_return alterStatementSuffixClusterbySortby() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixClusterbySortby(); }
 
-	public HiveParser_FromClauseParser.virtualTableSource_return virtualTableSource() throws RecognitionException { return gFromClauseParser.virtualTableSource(); }
+	public HiveParser_IdentifiersParser.precedenceConcatenateOperator_return precedenceConcatenateOperator() throws RecognitionException { return gIdentifiersParser.precedenceConcatenateOperator(); }
 
-	public HiveParser_AlterClauseParser.alterTableStatementSuffix_return alterTableStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterTableStatementSuffix(); }
+	public HiveParser_FromClauseParser.subQuerySource_return subQuerySource() throws RecognitionException { return gFromClauseParser.subQuerySource(); }
 
-	public HiveParser_ResourcePlanParser.poolPath_return poolPath() throws RecognitionException { return gResourcePlanParser.poolPath(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixUpdateColumns_return alterStatementSuffixUpdateColumns() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateColumns(); }
 
-	public HiveParser_IdentifiersParser.stringLiteralSequence_return stringLiteralSequence() throws RecognitionException { return gIdentifiersParser.stringLiteralSequence(); }
+	public HiveParser_ResourcePlanParser.poolAssignList_return poolAssignList() throws RecognitionException { return gResourcePlanParser.poolAssignList(); }
 
-	public HiveParser_IdentifiersParser.precedenceFieldExpression_return precedenceFieldExpression() throws RecognitionException { return gIdentifiersParser.precedenceFieldExpression(); }
+	public HiveParser_IdentifiersParser.identifier_return identifier() throws RecognitionException { return gIdentifiersParser.identifier(); }
 
-	public HiveParser_AlterClauseParser.blocking_return blocking() throws RecognitionException { return gAlterClauseParser.blocking(); }
+	public HiveParser_FromClauseParser.atomjoinSource_return atomjoinSource() throws RecognitionException { return gFromClauseParser.atomjoinSource(); }
 
-	public HiveParser_IdentifiersParser.partitionSelectorOperator_return partitionSelectorOperator() throws RecognitionException { return gIdentifiersParser.partitionSelectorOperator(); }
+	public HiveParser_IdentifiersParser.groupingSetExpression_return groupingSetExpression() throws RecognitionException { return gIdentifiersParser.groupingSetExpression(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixExchangePartition_return alterStatementSuffixExchangePartition() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixExchangePartition(); }
+	public HiveParser_FromClauseParser.splitSample_return splitSample() throws RecognitionException { return gFromClauseParser.splitSample(); }
 
-	public HiveParser_AlterClauseParser.alterStatement_return alterStatement() throws RecognitionException { return gAlterClauseParser.alterStatement(); }
+	public HiveParser_SelectClauseParser.window_clause_return window_clause() throws RecognitionException { return gSelectClauseParser.window_clause(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixFileFormat_return alterStatementSuffixFileFormat(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixFileFormat(partition); }
+	public HiveParser_SelectClauseParser.selectList_return selectList() throws RecognitionException { return gSelectClauseParser.selectList(); }
 
-	public HiveParser_PrepareStatementParser.executeStatement_return executeStatement() throws RecognitionException { return gPrepareStatementParser.executeStatement(); }
+	public HiveParser_CreateDDLParser.dataConnectorType_return dataConnectorType() throws RecognitionException { return gCreateDDLParser.dataConnectorType(); }
 
-	public HiveParser_ResourcePlanParser.poolAssign_return poolAssign() throws RecognitionException { return gResourcePlanParser.poolAssign(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixUpdateStats_return alterStatementSuffixUpdateStats(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateStats(partition); }
 
-	public HiveParser_SelectClauseParser.selectExpressionList_return selectExpressionList() throws RecognitionException { return gSelectClauseParser.selectExpressionList(); }
+	public HiveParser_AlterClauseParser.alterDataConnectorSuffixSetOwner_return alterDataConnectorSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixSetOwner(); }
 
-	public HiveParser_AlterClauseParser.retentionOfSnapshots_return retentionOfSnapshots() throws RecognitionException { return gAlterClauseParser.retentionOfSnapshots(); }
+	public HiveParser_IdentifiersParser.precedenceBitwiseOrExpression_return precedenceBitwiseOrExpression() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseOrExpression(); }
 
-	public HiveParser_FromClauseParser.tableAlias_return tableAlias() throws RecognitionException { return gFromClauseParser.tableAlias(); }
+	public HiveParser_ResourcePlanParser.alterMappingStatement_return alterMappingStatement() throws RecognitionException { return gResourcePlanParser.alterMappingStatement(); }
 
-	public HiveParser_IdentifiersParser.precedenceEqualOperator_return precedenceEqualOperator() throws RecognitionException { return gIdentifiersParser.precedenceEqualOperator(); }
+	public HiveParser_IdentifiersParser.partitionVal_return partitionVal() throws RecognitionException { return gIdentifiersParser.partitionVal(); }
+
+	public HiveParser_ResourcePlanParser.globalWmStatement_return globalWmStatement() throws RecognitionException { return gResourcePlanParser.globalWmStatement(); }
 
 	public HiveParser_ResourcePlanParser.createTriggerStatement_return createTriggerStatement() throws RecognitionException { return gResourcePlanParser.createTriggerStatement(); }
 
-	public HiveParser_ResourcePlanParser.rpAssign_return rpAssign() throws RecognitionException { return gResourcePlanParser.rpAssign(); }
+	public HiveParser_FromClauseParser.joinSourcePart_return joinSourcePart() throws RecognitionException { return gFromClauseParser.joinSourcePart(); }
 
-	public HiveParser_IdentifiersParser.clusterByClause_return clusterByClause() throws RecognitionException { return gIdentifiersParser.clusterByClause(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixBucketNum_return alterStatementSuffixBucketNum(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixBucketNum(partition); }
 
-	public HiveParser_IdentifiersParser.partitionSelectorVal_return partitionSelectorVal() throws RecognitionException { return gIdentifiersParser.partitionSelectorVal(); }
+	public HiveParser_ResourcePlanParser.poolAssign_return poolAssign() throws RecognitionException { return gResourcePlanParser.poolAssign(); }
 
-	public HiveParser_IdentifiersParser.groupby_expression_return groupby_expression() throws RecognitionException { return gIdentifiersParser.groupby_expression(); }
+	public HiveParser_IdentifiersParser.havingCondition_return havingCondition() throws RecognitionException { return gIdentifiersParser.havingCondition(); }
 
-	public HiveParser_IdentifiersParser.expressionsNotInParenthesis_return expressionsNotInParenthesis(boolean isStruct, boolean forceStruct) throws RecognitionException { return gIdentifiersParser.expressionsNotInParenthesis(isStruct, forceStruct); }
+	public HiveParser_IdentifiersParser.nonReserved_return nonReserved() throws RecognitionException { return gIdentifiersParser.nonReserved(); }
 
-	public HiveParser_ResourcePlanParser.dropTriggerStatement_return dropTriggerStatement() throws RecognitionException { return gResourcePlanParser.dropTriggerStatement(); }
+	public HiveParser_IdentifiersParser.havingClause_return havingClause() throws RecognitionException { return gIdentifiersParser.havingClause(); }
 
-	public HiveParser_FromClauseParser.whereClause_return whereClause() throws RecognitionException { return gFromClauseParser.whereClause(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixStatsPart_return alterStatementSuffixStatsPart() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixStatsPart(); }
 
-	public HiveParser_IdentifiersParser.null_treatment_return null_treatment() throws RecognitionException { return gIdentifiersParser.null_treatment(); }
+	public HiveParser_IdentifiersParser.expressionsInParenthesis_return expressionsInParenthesis(boolean isStruct, boolean forceStruct) throws RecognitionException { return gIdentifiersParser.expressionsInParenthesis(isStruct, forceStruct); }
 
-	public HiveParser_AlterClauseParser.alterDatabaseStatementSuffix_return alterDatabaseStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterDatabaseStatementSuffix(); }
+	public HiveParser_FromClauseParser.whereClause_return whereClause() throws RecognitionException { return gFromClauseParser.whereClause(); }
 
-	public HiveParser_AlterClauseParser.tablePartitionPrefix_return tablePartitionPrefix() throws RecognitionException { return gAlterClauseParser.tablePartitionPrefix(); }
+	public HiveParser_AlterClauseParser.alterStatementSuffixFileFormat_return alterStatementSuffixFileFormat(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixFileFormat(partition); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixStatsPart_return alterStatementSuffixStatsPart() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixStatsPart(); }
+	public HiveParser_IdentifiersParser.precedenceStarExpression_return precedenceStarExpression() throws RecognitionException { return gIdentifiersParser.precedenceStarExpression(); }
 
-	public HiveParser_IdentifiersParser.precedenceOrOperator_return precedenceOrOperator() throws RecognitionException { return gIdentifiersParser.precedenceOrOperator(); }
+	public HiveParser_ResourcePlanParser.triggerAndExpression_return triggerAndExpression() throws RecognitionException { return gResourcePlanParser.triggerAndExpression(); }
 
-	public HiveParser_FromClauseParser.splitSample_return splitSample() throws RecognitionException { return gFromClauseParser.splitSample(); }
+	public HiveParser_FromClauseParser.fromSource_return fromSource() throws RecognitionException { return gFromClauseParser.fromSource(); }
 
-	public HiveParser_AlterClauseParser.alterMaterializedViewSuffixRebuild_return alterMaterializedViewSuffixRebuild(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewSuffixRebuild(tableNameTree); }
+	public HiveParser_IdentifiersParser.columnRefOrderInParenthesis_return columnRefOrderInParenthesis() throws RecognitionException { return gIdentifiersParser.columnRefOrderInParenthesis(); }
 
-	public HiveParser_IdentifiersParser.precedenceConcatenateOperator_return precedenceConcatenateOperator() throws RecognitionException { return gIdentifiersParser.precedenceConcatenateOperator(); }
+	public HiveParser_IdentifiersParser.precedenceSimilarExpressionPart_return precedenceSimilarExpressionPart(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionPart(t); }
 
-	public HiveParser_FromClauseParser.uniqueJoinToken_return uniqueJoinToken() throws RecognitionException { return gFromClauseParser.uniqueJoinToken(); }
+	public HiveParser_ResourcePlanParser.disable_return disable() throws RecognitionException { return gResourcePlanParser.disable(); }
 
-	public HiveParser_AlterClauseParser.alterDataConnectorStatementSuffix_return alterDataConnectorStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterDataConnectorStatementSuffix(); }
+	public HiveParser_AlterClauseParser.alterStatementChangeColPosition_return alterStatementChangeColPosition() throws RecognitionException { return gAlterClauseParser.alterStatementChangeColPosition(); }
 
-	public HiveParser_IdentifiersParser.floorExpression_return floorExpression() throws RecognitionException { return gIdentifiersParser.floorExpression(); }
+	public HiveParser_IdentifiersParser.precedencePlusExpression_return precedencePlusExpression() throws RecognitionException { return gIdentifiersParser.precedencePlusExpression(); }
 
-	public HiveParser_IdentifiersParser.firstExpressionsWithAlias_return firstExpressionsWithAlias() throws RecognitionException { return gIdentifiersParser.firstExpressionsWithAlias(); }
+	public HiveParser_ResourcePlanParser.triggerAtomExpression_return triggerAtomExpression() throws RecognitionException { return gResourcePlanParser.triggerAtomExpression(); }
 
-	public HiveParser_IdentifiersParser.qualifyClause_return qualifyClause() throws RecognitionException { return gIdentifiersParser.qualifyClause(); }
+	public HiveParser_IdentifiersParser.precedenceNotExpression_return precedenceNotExpression() throws RecognitionException { return gIdentifiersParser.precedenceNotExpression(); }
 
-	public HiveParser_SelectClauseParser.trfmClause_return trfmClause() throws RecognitionException { return gSelectClauseParser.trfmClause(); }
+	public HiveParser_IdentifiersParser.floorExpression_return floorExpression() throws RecognitionException { return gIdentifiersParser.floorExpression(); }
 
-	public HiveParser_ResourcePlanParser.resourcePlanDdlStatements_return resourcePlanDdlStatements() throws RecognitionException { return gResourcePlanParser.resourcePlanDdlStatements(); }
+	public HiveParser_IdentifiersParser.sysFuncNames_return sysFuncNames() throws RecognitionException { return gIdentifiersParser.sysFuncNames(); }
 
-	public HiveParser_CreateDDLParser.dataConnectorType_return dataConnectorType() throws RecognitionException { return gCreateDDLParser.dataConnectorType(); }
+	public HiveParser_IdentifiersParser.expressionWithAlias_return expressionWithAlias() throws RecognitionException { return gIdentifiersParser.expressionWithAlias(); }
 
-	public HiveParser_IdentifiersParser.havingClause_return havingClause() throws RecognitionException { return gIdentifiersParser.havingClause(); }
+	public HiveParser_IdentifiersParser.partitionSelectorSpec_return partitionSelectorSpec() throws RecognitionException { return gIdentifiersParser.partitionSelectorSpec(); }
 
-	public HiveParser_SelectClauseParser.selectTrfmClause_return selectTrfmClause() throws RecognitionException { return gSelectClauseParser.selectTrfmClause(); }
+	public HiveParser_CreateDDLParser.dataConnectorUrl_return dataConnectorUrl() throws RecognitionException { return gCreateDDLParser.dataConnectorUrl(); }
 
-	public HiveParser_IdentifiersParser.precedenceConcatenateExpression_return precedenceConcatenateExpression() throws RecognitionException { return gIdentifiersParser.precedenceConcatenateExpression(); }
+	public HiveParser_IdentifiersParser.constant_return constant() throws RecognitionException { return gIdentifiersParser.constant(); }
 
-	public HiveParser_IdentifiersParser.precedenceBitwiseXorOperator_return precedenceBitwiseXorOperator() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseXorOperator(); }
+	public HiveParser_FromClauseParser.valuesClause_return valuesClause() throws RecognitionException { return gFromClauseParser.valuesClause(); }
 
-	public HiveParser_IdentifiersParser.expressionOrDefault_return expressionOrDefault() throws RecognitionException { return gIdentifiersParser.expressionOrDefault(); }
+	public HiveParser_FromClauseParser.tableSource_return tableSource() throws RecognitionException { return gFromClauseParser.tableSource(); }
 
-	public HiveParser_ResourcePlanParser.triggerExpression_return triggerExpression() throws RecognitionException { return gResourcePlanParser.triggerExpression(); }
+	public HiveParser_IdentifiersParser.precedenceStarOperator_return precedenceStarOperator() throws RecognitionException { return gIdentifiersParser.precedenceStarOperator(); }
 
-	public HiveParser_FromClauseParser.firstValueRowConstructor_return firstValueRowConstructor() throws RecognitionException { return gFromClauseParser.firstValueRowConstructor(); }
+	public HiveParser_ResourcePlanParser.triggerOrExpression_return triggerOrExpression() throws RecognitionException { return gResourcePlanParser.triggerOrExpression(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixProperties_return alterStatementSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixProperties(); }
+	public HiveParser_FromClauseParser.tableSample_return tableSample() throws RecognitionException { return gFromClauseParser.tableSample(); }
 
-	public HiveParser_FromClauseParser.aliasList_return aliasList() throws RecognitionException { return gFromClauseParser.aliasList(); }
+	public HiveParser_IdentifiersParser.expressions_return expressions() throws RecognitionException { return gIdentifiersParser.expressions(); }
 
-	public HiveParser_IdentifiersParser.whenExpression_return whenExpression() throws RecognitionException { return gIdentifiersParser.whenExpression(); }
+	public HiveParser_IdentifiersParser.precedenceAndExpression_return precedenceAndExpression() throws RecognitionException { return gIdentifiersParser.precedenceAndExpression(); }
 
-	public HiveParser_AlterClauseParser.refRetain_return refRetain() throws RecognitionException { return gAlterClauseParser.refRetain(); }
+	public HiveParser_ResourcePlanParser.rpAssignList_return rpAssignList() throws RecognitionException { return gResourcePlanParser.rpAssignList(); }
 
-	public HiveParser_SelectClauseParser.window_frame_boundary_return window_frame_boundary() throws RecognitionException { return gSelectClauseParser.window_frame_boundary(); }
+	public HiveParser_AlterClauseParser.alterDataConnectorSuffixSetUrl_return alterDataConnectorSuffixSetUrl() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixSetUrl(); }
 
-	public HiveParser_AlterClauseParser.alterStatementSuffixSerdeProperties_return alterStatementSuffixSerdeProperties(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSerdeProperties(partition); }
+	public HiveParser_FromClauseParser.uniqueJoinToken_return uniqueJoinToken() throws RecognitionException { return gFromClauseParser.uniqueJoinToken(); }
 
-	public HiveParser_FromClauseParser.partitionTableFunctionSource_return partitionTableFunctionSource() throws RecognitionException { return gFromClauseParser.partitionTableFunctionSource(); }
+	public HiveParser_AlterClauseParser.alterTableStatementSuffix_return alterTableStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterTableStatementSuffix(); }
 
-	public HiveParser_AlterClauseParser.compactPool_return compactPool() throws RecognitionException { return gAlterClauseParser.compactPool(); }
+	public HiveParser_CreateDDLParser.dataConnectorComment_return dataConnectorComment() throws RecognitionException { return gCreateDDLParser.dataConnectorComment(); }
 
-	public HiveParser_AlterClauseParser.alterDatabaseSuffixProperties_return alterDatabaseSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixProperties(); }
+	public HiveParser_IdentifiersParser.precedenceAmpersandOperator_return precedenceAmpersandOperator() throws RecognitionException { return gIdentifiersParser.precedenceAmpersandOperator(); }
 
-	public HiveParser_AlterClauseParser.alterDataConnectorSuffixProperties_return alterDataConnectorSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixProperties(); }
+	public HiveParser_ResourcePlanParser.alterTriggerStatement_return alterTriggerStatement() throws RecognitionException { return gResourcePlanParser.alterTriggerStatement(); }
 
-	public
HiveParser_IdentifiersParser.timeQualifiers_return timeQualifiers() throws RecognitionException { return gIdentifiersParser.timeQualifiers(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionPartNot_return precedenceSimilarExpressionPartNot(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionPartNot(t); }\n \n-\tpublic HiveParser_ResourcePlanParser.rpUnassignList_return rpUnassignList() throws RecognitionException { return gResourcePlanParser.rpUnassignList(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixDropBranch_return alterStatementSuffixDropBranch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropBranch(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixSetOwner_return alterDatabaseSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetOwner(); }\n+\tpublic HiveParser_IdentifiersParser.intervalValue_return intervalValue() throws RecognitionException { return gIdentifiersParser.intervalValue(); }\n \n-\tpublic HiveParser_CreateDDLParser.createDataConnectorStatement_return createDataConnectorStatement() throws RecognitionException { return gCreateDDLParser.createDataConnectorStatement(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixExecute_return alterStatementSuffixExecute() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixExecute(); }\n \n-\tpublic HiveParser_IdentifiersParser.expressionsInParenthesis_return expressionsInParenthesis(boolean isStruct, boolean forceStruct) throws RecognitionException { return gIdentifiersParser.expressionsInParenthesis(isStruct, forceStruct); }\n+\tpublic HiveParser_ResourcePlanParser.dropMappingStatement_return dropMappingStatement() throws RecognitionException { return gResourcePlanParser.dropMappingStatement(); }\n \n-\tpublic HiveParser_FromClauseParser.tableName_return tableName() throws RecognitionException { 
return gFromClauseParser.tableName(); }\n+\tpublic HiveParser_SelectClauseParser.selectItem_return selectItem() throws RecognitionException { return gSelectClauseParser.selectItem(); }\n \n-\tpublic HiveParser_IdentifiersParser.caseExpression_return caseExpression() throws RecognitionException { return gIdentifiersParser.caseExpression(); }\n+\tpublic HiveParser_FromClauseParser.partitioningSpec_return partitioningSpec() throws RecognitionException { return gFromClauseParser.partitioningSpec(); }\n \n-\tpublic HiveParser_IdentifiersParser.partitionByClause_return partitionByClause() throws RecognitionException { return gIdentifiersParser.partitionByClause(); }\n+\tpublic HiveParser_FromClauseParser.joinToken_return joinToken() throws RecognitionException { return gFromClauseParser.joinToken(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixBucketNum_return alterStatementSuffixBucketNum(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixBucketNum(partition); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixRenameCol_return alterStatementSuffixRenameCol() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRenameCol(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixCreateTag_return alterStatementSuffixCreateTag() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCreateTag(); }\n+\tpublic HiveParser_IdentifiersParser.groupingSetExpressionMultiple_return groupingSetExpressionMultiple() throws RecognitionException { return gIdentifiersParser.groupingSetExpressionMultiple(); }\n \n-\tpublic HiveParser_IdentifiersParser.sql11ReservedKeywordsUsedAsFunctionName_return sql11ReservedKeywordsUsedAsFunctionName() throws RecognitionException { return gIdentifiersParser.sql11ReservedKeywordsUsedAsFunctionName(); }\n+\tpublic HiveParser_AlterClauseParser.fileFormat_return fileFormat() throws RecognitionException { return 
gAlterClauseParser.fileFormat(); }\n \n-\tpublic HiveParser_IdentifiersParser.intervalQualifiers_return intervalQualifiers() throws RecognitionException { return gIdentifiersParser.intervalQualifiers(); }\n+\tpublic HiveParser_ResourcePlanParser.replaceResourcePlanStatement_return replaceResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.replaceResourcePlanStatement(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceAndOperator_return precedenceAndOperator() throws RecognitionException { return gIdentifiersParser.precedenceAndOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddPartitions_return alterStatementSuffixAddPartitions(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddPartitions(table); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionPart_return precedenceSimilarExpressionPart(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionPart(t); }\n+\tpublic HiveParser_ResourcePlanParser.dropResourcePlanStatement_return dropResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.dropResourcePlanStatement(); }\n \n-\tpublic HiveParser_SelectClauseParser.window_value_expression_return window_value_expression() throws RecognitionException { return gSelectClauseParser.window_value_expression(); }\n+\tpublic HiveParser_FromClauseParser.searchCondition_return searchCondition() throws RecognitionException { return gFromClauseParser.searchCondition(); }\n \n-\tpublic HiveParser_IdentifiersParser.partitionSpec_return partitionSpec() throws RecognitionException { return gIdentifiersParser.partitionSpec(); }\n+\tpublic HiveParser_IdentifiersParser.intervalExpression_return intervalExpression() throws RecognitionException { return gIdentifiersParser.intervalExpression(); }\n \n-\tpublic HiveParser_IdentifiersParser.groupByEmpty_return groupByEmpty() throws RecognitionException { 
return gIdentifiersParser.groupByEmpty(); }\n+\tpublic HiveParser_FromClauseParser.uniqueJoinExpr_return uniqueJoinExpr() throws RecognitionException { return gFromClauseParser.uniqueJoinExpr(); }\n \n-\tpublic HiveParser_PrepareStatementParser.executeParamList_return executeParamList() throws RecognitionException { return gPrepareStatementParser.executeParamList(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixCompact_return alterStatementSuffixCompact() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCompact(); }\n \n-\tpublic HiveParser_SelectClauseParser.window_specification_return window_specification(CommonTree nullTreatment) throws RecognitionException { return gSelectClauseParser.window_specification(nullTreatment); }\n+\tpublic HiveParser_IdentifiersParser.subQueryExpression_return subQueryExpression() throws RecognitionException { return gIdentifiersParser.subQueryExpression(); }\n \n-\tpublic HiveParser_FromClauseParser.atomjoinSource_return atomjoinSource() throws RecognitionException { return gFromClauseParser.atomjoinSource(); }\n+\tpublic HiveParser_FromClauseParser.defaultValue_return defaultValue() throws RecognitionException { return gFromClauseParser.defaultValue(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarOperator_return precedenceSimilarOperator() throws RecognitionException { return gIdentifiersParser.precedenceSimilarOperator(); }\n+\tpublic HiveParser_FromClauseParser.tableBucketSample_return tableBucketSample() throws RecognitionException { return gFromClauseParser.tableBucketSample(); }\n \n-\tpublic HiveParser_IdentifiersParser.orderByClause_return orderByClause() throws RecognitionException { return gIdentifiersParser.orderByClause(); }\n+\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixSetLocation_return alterDatabaseSuffixSetLocation() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetLocation(); }\n \n-\tpublic 
HiveParser_IdentifiersParser.rollupOldSyntax_return rollupOldSyntax() throws RecognitionException { return gIdentifiersParser.rollupOldSyntax(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceDistinctOperator_return precedenceDistinctOperator() throws RecognitionException { return gIdentifiersParser.precedenceDistinctOperator(); }\n \n-\tpublic HiveParser_AlterClauseParser.skewedLocations_return skewedLocations() throws RecognitionException { return gAlterClauseParser.skewedLocations(); }\n+\tpublic HiveParser_IdentifiersParser.floorDateQualifiers_return floorDateQualifiers() throws RecognitionException { return gIdentifiersParser.floorDateQualifiers(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceStarExpression_return precedenceStarExpression() throws RecognitionException { return gIdentifiersParser.precedenceStarExpression(); }\n+\tpublic HiveParser_FromClauseParser.valuesTableConstructor_return valuesTableConstructor() throws RecognitionException { return gFromClauseParser.valuesTableConstructor(); }\n \n-\tpublic HiveParser_IdentifiersParser.atomExpression_return atomExpression() throws RecognitionException { return gIdentifiersParser.atomExpression(); }\n+\tpublic HiveParser_IdentifiersParser.null_treatment_return null_treatment() throws RecognitionException { return gIdentifiersParser.null_treatment(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceNotOperator_return precedenceNotOperator() throws RecognitionException { return gIdentifiersParser.precedenceNotOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixSetOwner_return alterDatabaseSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetOwner(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceBitwiseOrExpression_return precedenceBitwiseOrExpression() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseOrExpression(); }\n+\tpublic 
HiveParser_AlterClauseParser.alterStatementPartitionKeyType_return alterStatementPartitionKeyType() throws RecognitionException { return gAlterClauseParser.alterStatementPartitionKeyType(); }\n \n-\tpublic HiveParser_IdentifiersParser.partitionSelectorSpec_return partitionSelectorSpec() throws RecognitionException { return gIdentifiersParser.partitionSelectorSpec(); }\n+\tpublic HiveParser_IdentifiersParser.partitionByClause_return partitionByClause() throws RecognitionException { return gIdentifiersParser.partitionByClause(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixCreateBranch_return alterStatementSuffixCreateBranch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCreateBranch(); }\n+\tpublic HiveParser_ResourcePlanParser.createResourcePlanStatement_return createResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.createResourcePlanStatement(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceBitwiseOrOperator_return precedenceBitwiseOrOperator() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseOrOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterMaterializedViewStatementSuffix_return alterMaterializedViewStatementSuffix(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewStatementSuffix(tableNameTree); }\n \n-\tpublic HiveParser_FromClauseParser.tableBucketSample_return tableBucketSample() throws RecognitionException { return gFromClauseParser.tableBucketSample(); }\n+\tpublic HiveParser_SelectClauseParser.window_frame_start_boundary_return window_frame_start_boundary() throws RecognitionException { return gSelectClauseParser.window_frame_start_boundary(); }\n \n-\tpublic HiveParser_ResourcePlanParser.triggerLiteral_return triggerLiteral() throws RecognitionException { return gResourcePlanParser.triggerLiteral(); }\n+\tpublic HiveParser_IdentifiersParser.groupingExpressionSingle_return 
groupingExpressionSingle() throws RecognitionException { return gIdentifiersParser.groupingExpressionSingle(); }\n \n-\tpublic HiveParser_IdentifiersParser.distributeByClause_return distributeByClause() throws RecognitionException { return gIdentifiersParser.distributeByClause(); }\n+\tpublic HiveParser_FromClauseParser.lateralView_return lateralView() throws RecognitionException { return gFromClauseParser.lateralView(); }\n \n-\tpublic HiveParser_IdentifiersParser.functionName_return functionName() throws RecognitionException { return gIdentifiersParser.functionName(); }\n+\tpublic HiveParser_ResourcePlanParser.withReplace_return withReplace() throws RecognitionException { return gResourcePlanParser.withReplace(); }\n \n-\tpublic HiveParser_ResourcePlanParser.dropPoolStatement_return dropPoolStatement() throws RecognitionException { return gResourcePlanParser.dropPoolStatement(); }\n+\tpublic HiveParser_CreateDDLParser.createTableStatement_return createTableStatement() throws RecognitionException { return gCreateDDLParser.createTableStatement(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddConstraint_return alterStatementSuffixAddConstraint() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddConstraint(); }\n+\tpublic HiveParser_FromClauseParser.valueRowConstructor_return valueRowConstructor() throws RecognitionException { return gFromClauseParser.valueRowConstructor(); }\n \n-\tpublic HiveParser_ResourcePlanParser.replaceResourcePlanStatement_return replaceResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.replaceResourcePlanStatement(); }\n+\tpublic HiveParser_ResourcePlanParser.rpUnassign_return rpUnassign() throws RecognitionException { return gResourcePlanParser.rpUnassign(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterTblPartitionStatementSuffix_return alterTblPartitionStatementSuffix(boolean partition) throws RecognitionException { return 
gAlterClauseParser.alterTblPartitionStatementSuffix(partition); }\n+\tpublic HiveParser_AlterClauseParser.snapshotIdOfRef_return snapshotIdOfRef() throws RecognitionException { return gAlterClauseParser.snapshotIdOfRef(); }\n \n-\tpublic HiveParser_IdentifiersParser.expressionWithAlias_return expressionWithAlias() throws RecognitionException { return gIdentifiersParser.expressionWithAlias(); }\n+\tpublic HiveParser_CreateDDLParser.createDataConnectorStatement_return createDataConnectorStatement() throws RecognitionException { return gCreateDDLParser.createDataConnectorStatement(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceDistinctOperator_return precedenceDistinctOperator() throws RecognitionException { return gIdentifiersParser.precedenceDistinctOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixRename_return alterStatementSuffixRename(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRename(table); }\n \n-\tpublic HiveParser_CreateDDLParser.dataConnectorComment_return dataConnectorComment() throws RecognitionException { return gCreateDDLParser.dataConnectorComment(); }\n+\tpublic HiveParser_FromClauseParser.uniqueJoinTableSource_return uniqueJoinTableSource() throws RecognitionException { return gFromClauseParser.uniqueJoinTableSource(); }\n \n-\tpublic HiveParser_IdentifiersParser.principalIdentifier_return principalIdentifier() throws RecognitionException { return gIdentifiersParser.principalIdentifier(); }\n+\tpublic HiveParser_IdentifiersParser.timestampLiteral_return timestampLiteral() throws RecognitionException { return gIdentifiersParser.timestampLiteral(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixDropConstraint_return alterStatementSuffixDropConstraint() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropConstraint(); }\n+\tpublic HiveParser_IdentifiersParser.trimFunction_return trimFunction() throws RecognitionException { 
return gIdentifiersParser.trimFunction(); }\n \n-\tpublic HiveParser_FromClauseParser.defaultValue_return defaultValue() throws RecognitionException { return gFromClauseParser.defaultValue(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionAtom_return precedenceSimilarExpressionAtom(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionAtom(t); }\n \n-\tpublic HiveParser_IdentifiersParser.precedencePlusExpression_return precedencePlusExpression() throws RecognitionException { return gIdentifiersParser.precedencePlusExpression(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixCreateTag_return alterStatementSuffixCreateTag() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixCreateTag(); }\n \n-\tpublic HiveParser_FromClauseParser.tableOrColumn_return tableOrColumn() throws RecognitionException { return gFromClauseParser.tableOrColumn(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceAndOperator_return precedenceAndOperator() throws RecognitionException { return gIdentifiersParser.precedenceAndOperator(); }\n \n-\tpublic HiveParser_IdentifiersParser.timestampLiteral_return timestampLiteral() throws RecognitionException { return gIdentifiersParser.timestampLiteral(); }\n+\tpublic HiveParser_FromClauseParser.tableName_return tableName() throws RecognitionException { return gFromClauseParser.tableName(); }\n \n-\tpublic HiveParser_IdentifiersParser.intervalLiteral_return intervalLiteral() throws RecognitionException { return gIdentifiersParser.intervalLiteral(); }\n+\tpublic HiveParser_IdentifiersParser.groupByClause_return groupByClause() throws RecognitionException { return gIdentifiersParser.groupByClause(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterDataConnectorSuffixSetOwner_return alterDataConnectorSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterDataConnectorSuffixSetOwner(); }\n+\tpublic 
HiveParser_FromClauseParser.partitionedTableFunction_return partitionedTableFunction() throws RecognitionException { return gFromClauseParser.partitionedTableFunction(); }\n \n-\tpublic HiveParser_IdentifiersParser.groupByClause_return groupByClause() throws RecognitionException { return gIdentifiersParser.groupByClause(); }\n+\tpublic HiveParser_ResourcePlanParser.enable_return enable() throws RecognitionException { return gResourcePlanParser.enable(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixSetManagedLocation_return alterDatabaseSuffixSetManagedLocation() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetManagedLocation(); }\n+\tpublic HiveParser_IdentifiersParser.subQuerySelectorOperator_return subQuerySelectorOperator() throws RecognitionException { return gIdentifiersParser.subQuerySelectorOperator(); }\n \n-\tpublic HiveParser_AlterClauseParser.snapshotIdOfRef_return snapshotIdOfRef() throws RecognitionException { return gAlterClauseParser.snapshotIdOfRef(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceEqualOperator_return precedenceEqualOperator() throws RecognitionException { return gIdentifiersParser.precedenceEqualOperator(); }\n \n-\tpublic HiveParser_FromClauseParser.valuesClause_return valuesClause() throws RecognitionException { return gFromClauseParser.valuesClause(); }\n+\tpublic HiveParser_IdentifiersParser.timestampLocalTZLiteral_return timestampLocalTZLiteral() throws RecognitionException { return gIdentifiersParser.timestampLocalTZLiteral(); }\n \n-\tpublic HiveParser_CreateDDLParser.createTableStatement_return createTableStatement() throws RecognitionException { return gCreateDDLParser.createTableStatement(); }\n+\tpublic HiveParser_AlterClauseParser.alterMaterializedViewSuffixRebuild_return alterMaterializedViewSuffixRebuild(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewSuffixRebuild(tableNameTree); }\n \n-\tpublic 
HiveParser_IdentifiersParser.floorDateQualifiers_return floorDateQualifiers() throws RecognitionException { return gIdentifiersParser.floorDateQualifiers(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixUpdateStatsCol_return alterStatementSuffixUpdateStatsCol(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateStatsCol(partition); }\n \n-\tpublic HiveParser_AlterClauseParser.fileFormat_return fileFormat() throws RecognitionException { return gAlterClauseParser.fileFormat(); }\n+\tpublic HiveParser_SelectClauseParser.selectExpression_return selectExpression() throws RecognitionException { return gSelectClauseParser.selectExpression(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddPartitions_return alterStatementSuffixAddPartitions(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddPartitions(table); }\n+\tpublic HiveParser_AlterClauseParser.alterStatement_return alterStatement() throws RecognitionException { return gAlterClauseParser.alterStatement(); }\n \n-\tpublic HiveParser_ResourcePlanParser.triggerAtomExpression_return triggerAtomExpression() throws RecognitionException { return gResourcePlanParser.triggerAtomExpression(); }\n+\tpublic HiveParser_AlterClauseParser.alterDatabaseStatementSuffix_return alterDatabaseStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterDatabaseStatementSuffix(); }\n \n-\tpublic HiveParser_SelectClauseParser.window_clause_return window_clause() throws RecognitionException { return gSelectClauseParser.window_clause(); }\n+\tpublic HiveParser_IdentifiersParser.intervalQualifiers_return intervalQualifiers() throws RecognitionException { return gIdentifiersParser.intervalQualifiers(); }\n \n-\tpublic HiveParser_SelectClauseParser.window_frame_start_boundary_return window_frame_start_boundary() throws RecognitionException { return gSelectClauseParser.window_frame_start_boundary(); 
}\n+\tpublic HiveParser_IdentifiersParser.rollupOldSyntax_return rollupOldSyntax() throws RecognitionException { return gIdentifiersParser.rollupOldSyntax(); }\n \n-\tpublic HiveParser_ResourcePlanParser.createResourcePlanStatement_return createResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.createResourcePlanStatement(); }\n+\tpublic HiveParser_AlterClauseParser.retentionOfSnapshots_return retentionOfSnapshots() throws RecognitionException { return gAlterClauseParser.retentionOfSnapshots(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterDatabaseSuffixSetLocation_return alterDatabaseSuffixSetLocation() throws RecognitionException { return gAlterClauseParser.alterDatabaseSuffixSetLocation(); }\n+\tpublic HiveParser_FromClauseParser.fromClause_return fromClause() throws RecognitionException { return gFromClauseParser.fromClause(); }\n \n-\tpublic HiveParser_ResourcePlanParser.createPoolStatement_return createPoolStatement() throws RecognitionException { return gResourcePlanParser.createPoolStatement(); }\n+\tpublic HiveParser_AlterClauseParser.compactPool_return compactPool() throws RecognitionException { return gAlterClauseParser.compactPool(); }\n \n-\tpublic HiveParser_FromClauseParser.fromSource_return fromSource() throws RecognitionException { return gFromClauseParser.fromSource(); }\n+\tpublic HiveParser_ResourcePlanParser.poolPath_return poolPath() throws RecognitionException { return gResourcePlanParser.poolPath(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixUpdateColumns_return alterStatementSuffixUpdateColumns() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixUpdateColumns(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceUnarySuffixExpression_return precedenceUnarySuffixExpression() throws RecognitionException { return gIdentifiersParser.precedenceUnarySuffixExpression(); }\n \n-\tpublic HiveParser_IdentifiersParser.intervalValue_return intervalValue() throws 
RecognitionException { return gIdentifiersParser.intervalValue(); }\n+\tpublic HiveParser_FromClauseParser.tableAlias_return tableAlias() throws RecognitionException { return gFromClauseParser.tableAlias(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddPartitionsElement_return alterStatementSuffixAddPartitionsElement() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddPartitionsElement(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceEqualExpression_return precedenceEqualExpression() throws RecognitionException { return gIdentifiersParser.precedenceEqualExpression(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixLocation_return alterStatementSuffixLocation(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixLocation(partition); }\n+\tpublic HiveParser_IdentifiersParser.expression_return expression() throws RecognitionException { return gIdentifiersParser.expression(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixTouch_return alterStatementSuffixTouch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixTouch(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceAmpersandExpression_return precedenceAmpersandExpression() throws RecognitionException { return gIdentifiersParser.precedenceAmpersandExpression(); }\n \n-\tpublic HiveParser_ResourcePlanParser.triggerActionExpressionStandalone_return triggerActionExpressionStandalone() throws RecognitionException { return gResourcePlanParser.triggerActionExpressionStandalone(); }\n+\tpublic HiveParser_IdentifiersParser.castExpression_return castExpression() throws RecognitionException { return gIdentifiersParser.castExpression(); }\n \n-\tpublic HiveParser_FromClauseParser.partitioningSpec_return partitioningSpec() throws RecognitionException { return gFromClauseParser.partitioningSpec(); }\n+\tpublic 
HiveParser_ResourcePlanParser.triggerActionExpressionStandalone_return triggerActionExpressionStandalone() throws RecognitionException { return gResourcePlanParser.triggerActionExpressionStandalone(); }\n \n-\tpublic HiveParser_IdentifiersParser.columnRefOrderNotInParenthesis_return columnRefOrderNotInParenthesis() throws RecognitionException { return gIdentifiersParser.columnRefOrderNotInParenthesis(); }\n+\tpublic HiveParser_IdentifiersParser.sql11ReservedKeywordsUsedAsFunctionName_return sql11ReservedKeywordsUsedAsFunctionName() throws RecognitionException { return gIdentifiersParser.sql11ReservedKeywordsUsedAsFunctionName(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionIn_return precedenceSimilarExpressionIn(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionIn(t); }\n+\tpublic HiveParser_IdentifiersParser.descFuncNames_return descFuncNames() throws RecognitionException { return gIdentifiersParser.descFuncNames(); }\n \n-\tpublic HiveParser_FromClauseParser.joinToken_return joinToken() throws RecognitionException { return gFromClauseParser.joinToken(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceBitwiseOrOperator_return precedenceBitwiseOrOperator() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseOrOperator(); }\n \n-\tpublic HiveParser_ResourcePlanParser.rpUnassign_return rpUnassign() throws RecognitionException { return gResourcePlanParser.rpUnassign(); }\n+\tpublic HiveParser_PrepareStatementParser.executeParamList_return executeParamList() throws RecognitionException { return gPrepareStatementParser.executeParamList(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSetPartSpec_return alterStatementSuffixSetPartSpec() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSetPartSpec(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSkewedby_return alterStatementSuffixSkewedby() throws 
RecognitionException { return gAlterClauseParser.alterStatementSuffixSkewedby(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterMaterializedViewSuffixRewrite_return alterMaterializedViewSuffixRewrite(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewSuffixRewrite(tableNameTree); }\n+\tpublic HiveParser_IdentifiersParser.precedenceUnaryPrefixExpression_return precedenceUnaryPrefixExpression() throws RecognitionException { return gIdentifiersParser.precedenceUnaryPrefixExpression(); }\n \n-\tpublic HiveParser_IdentifiersParser.timeUnitQualifiers_return timeUnitQualifiers() throws RecognitionException { return gIdentifiersParser.timeUnitQualifiers(); }\n+\tpublic HiveParser_IdentifiersParser.groupby_expression_return groupby_expression() throws RecognitionException { return gIdentifiersParser.groupby_expression(); }\n \n-\tpublic HiveParser_FromClauseParser.uniqueJoinSource_return uniqueJoinSource() throws RecognitionException { return gFromClauseParser.uniqueJoinSource(); }\n+\tpublic HiveParser_AlterClauseParser.blocking_return blocking() throws RecognitionException { return gAlterClauseParser.blocking(); }\n \n-\tpublic HiveParser_FromClauseParser.expressionList_return expressionList() throws RecognitionException { return gFromClauseParser.expressionList(); }\n+\tpublic HiveParser_IdentifiersParser.sortByClause_return sortByClause() throws RecognitionException { return gIdentifiersParser.sortByClause(); }\n \n-\tpublic HiveParser_ResourcePlanParser.poolAssignList_return poolAssignList() throws RecognitionException { return gResourcePlanParser.poolAssignList(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceBitwiseXorExpression_return precedenceBitwiseXorExpression() throws RecognitionException { return gIdentifiersParser.precedenceBitwiseXorExpression(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixClusterbySortby_return alterStatementSuffixClusterbySortby() throws RecognitionException 
{ return gAlterClauseParser.alterStatementSuffixClusterbySortby(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSetPartSpec_return alterStatementSuffixSetPartSpec() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSetPartSpec(); }\n \n-\tpublic HiveParser_ResourcePlanParser.unmanaged_return unmanaged() throws RecognitionException { return gResourcePlanParser.unmanaged(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceOrOperator_return precedenceOrOperator() throws RecognitionException { return gIdentifiersParser.precedenceOrOperator(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpression_return precedenceSimilarExpression() throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpression(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionQuantifierPredicate_return precedenceSimilarExpressionQuantifierPredicate(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionQuantifierPredicate(t); }\n \n-\tpublic HiveParser_CreateDDLParser.dropDataConnectorStatement_return dropDataConnectorStatement() throws RecognitionException { return gCreateDDLParser.dropDataConnectorStatement(); }\n+\tpublic HiveParser_FromClauseParser.joinSource_return joinSource() throws RecognitionException { return gFromClauseParser.joinSource(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixRenameCol_return alterStatementSuffixRenameCol() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRenameCol(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSetOwner_return alterStatementSuffixSetOwner() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSetOwner(); }\n \n-\tpublic HiveParser_IdentifiersParser.timestampLocalTZLiteral_return timestampLocalTZLiteral() throws RecognitionException { return gIdentifiersParser.timestampLocalTZLiteral(); }\n+\tpublic 
HiveParser_IdentifiersParser.partitionSpec_return partitionSpec() throws RecognitionException { return gIdentifiersParser.partitionSpec(); }\n \n-\tpublic HiveParser_ResourcePlanParser.withReplace_return withReplace() throws RecognitionException { return gResourcePlanParser.withReplace(); }\n+\tpublic HiveParser_IdentifiersParser.whenExpression_return whenExpression() throws RecognitionException { return gIdentifiersParser.whenExpression(); }\n \n-\tpublic HiveParser_ResourcePlanParser.disable_return disable() throws RecognitionException { return gResourcePlanParser.disable(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixProperties_return alterStatementSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixProperties(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementChangeColPosition_return alterStatementChangeColPosition() throws RecognitionException { return gAlterClauseParser.alterStatementChangeColPosition(); }\n+\tpublic HiveParser_IdentifiersParser.stringLiteralSequence_return stringLiteralSequence() throws RecognitionException { return gIdentifiersParser.stringLiteralSequence(); }\n \n-\tpublic HiveParser_IdentifiersParser.rollupStandard_return rollupStandard() throws RecognitionException { return gIdentifiersParser.rollupStandard(); }\n+\tpublic HiveParser_ResourcePlanParser.rpAssign_return rpAssign() throws RecognitionException { return gResourcePlanParser.rpAssign(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionQuantifierPredicate_return precedenceSimilarExpressionQuantifierPredicate(CommonTree t) throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionQuantifierPredicate(t); }\n+\tpublic HiveParser_FromClauseParser.viewName_return viewName() throws RecognitionException { return gFromClauseParser.viewName(); }\n \n-\tpublic HiveParser_IdentifiersParser.prepareStmtParam_return prepareStmtParam() throws RecognitionException { return 
gIdentifiersParser.prepareStmtParam(); }\n+\tpublic HiveParser_SelectClauseParser.selectClause_return selectClause() throws RecognitionException { return gSelectClauseParser.selectClause(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixSkewedby_return alterStatementSuffixSkewedby() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixSkewedby(); }\n+\tpublic HiveParser_FromClauseParser.tableAllColumns_return tableAllColumns() throws RecognitionException { return gFromClauseParser.tableAllColumns(); }\n \n-\tpublic HiveParser_IdentifiersParser.expressions_return expressions() throws RecognitionException { return gIdentifiersParser.expressions(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixMergeFiles_return alterStatementSuffixMergeFiles(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixMergeFiles(partition); }\n \n-\tpublic HiveParser_ResourcePlanParser.alterResourcePlanStatement_return alterResourcePlanStatement() throws RecognitionException { return gResourcePlanParser.alterResourcePlanStatement(); }\n+\tpublic HiveParser_IdentifiersParser.qualifyClause_return qualifyClause() throws RecognitionException { return gIdentifiersParser.qualifyClause(); }\n \n-\tpublic HiveParser_AlterClauseParser.skewedLocationsList_return skewedLocationsList() throws RecognitionException { return gAlterClauseParser.skewedLocationsList(); }\n+\tpublic HiveParser_AlterClauseParser.skewedLocationMap_return skewedLocationMap() throws RecognitionException { return gAlterClauseParser.skewedLocationMap(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterViewStatementSuffix_return alterViewStatementSuffix() throws RecognitionException { return gAlterClauseParser.alterViewStatementSuffix(); }\n+\tpublic HiveParser_IdentifiersParser.expressionsNotInParenthesis_return expressionsNotInParenthesis(boolean isStruct, boolean forceStruct) throws RecognitionException { return 
gIdentifiersParser.expressionsNotInParenthesis(isStruct, forceStruct); }\n \n-\tpublic HiveParser_IdentifiersParser.quantifierType_return quantifierType() throws RecognitionException { return gIdentifiersParser.quantifierType(); }\n+\tpublic HiveParser_IdentifiersParser.intervalLiteral_return intervalLiteral() throws RecognitionException { return gIdentifiersParser.intervalLiteral(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixArchive_return alterStatementSuffixArchive() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixArchive(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddCol_return alterStatementSuffixAddCol() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddCol(); }\n \n-\tpublic HiveParser_AlterClauseParser.skewedLocationMap_return skewedLocationMap() throws RecognitionException { return gAlterClauseParser.skewedLocationMap(); }\n+\tpublic HiveParser_ResourcePlanParser.activate_return activate() throws RecognitionException { return gResourcePlanParser.activate(); }\n \n-\tpublic HiveParser_IdentifiersParser.extractExpression_return extractExpression() throws RecognitionException { return gIdentifiersParser.extractExpression(); }\n+\tpublic HiveParser_AlterClauseParser.alterTblPartitionStatementSuffixSkewedLocation_return alterTblPartitionStatementSuffixSkewedLocation() throws RecognitionException { return gAlterClauseParser.alterTblPartitionStatementSuffixSkewedLocation(); }\n \n-\tpublic HiveParser_IdentifiersParser.subQueryExpression_return subQueryExpression() throws RecognitionException { return gIdentifiersParser.subQueryExpression(); }\n+\tpublic HiveParser_IdentifiersParser.partitionSelectorOperator_return partitionSelectorOperator() throws RecognitionException { return gIdentifiersParser.partitionSelectorOperator(); }\n \n-\tpublic HiveParser_FromClauseParser.valueRowConstructor_return valueRowConstructor() throws RecognitionException { return 
gFromClauseParser.valueRowConstructor(); }\n+\tpublic HiveParser_IdentifiersParser.atomExpression_return atomExpression() throws RecognitionException { return gIdentifiersParser.atomExpression(); }\n \n-\tpublic HiveParser_IdentifiersParser.booleanValue_return booleanValue() throws RecognitionException { return gIdentifiersParser.booleanValue(); }\n+\tpublic HiveParser_IdentifiersParser.rollupStandard_return rollupStandard() throws RecognitionException { return gIdentifiersParser.rollupStandard(); }\n \n-\tpublic HiveParser_IdentifiersParser.nonReserved_return nonReserved() throws RecognitionException { return gIdentifiersParser.nonReserved(); }\n+\tpublic HiveParser_IdentifiersParser.distributeByClause_return distributeByClause() throws RecognitionException { return gIdentifiersParser.distributeByClause(); }\n \n-\tpublic HiveParser_FromClauseParser.subQuerySource_return subQuerySource() throws RecognitionException { return gFromClauseParser.subQuerySource(); }\n+\tpublic HiveParser_IdentifiersParser.functionName_return functionName() throws RecognitionException { return gIdentifiersParser.functionName(); }\n \n-\tpublic HiveParser_IdentifiersParser.sysFuncNames_return sysFuncNames() throws RecognitionException { return gIdentifiersParser.sysFuncNames(); }\n+\tpublic HiveParser_AlterClauseParser.alterMaterializedViewSuffixRewrite_return alterMaterializedViewSuffixRewrite(CommonTree tableNameTree) throws RecognitionException { return gAlterClauseParser.alterMaterializedViewSuffixRewrite(tableNameTree); }\n \n-\tpublic HiveParser_ResourcePlanParser.rpAssignList_return rpAssignList() throws RecognitionException { return gResourcePlanParser.rpAssignList(); }\n+\tpublic HiveParser_FromClauseParser.asOfClause_return asOfClause() throws RecognitionException { return gFromClauseParser.asOfClause(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionAtom_return precedenceSimilarExpressionAtom(CommonTree t) throws RecognitionException { return 
gIdentifiersParser.precedenceSimilarExpressionAtom(t); }\n+\tpublic HiveParser_IdentifiersParser.columnRefOrderNotInParenthesis_return columnRefOrderNotInParenthesis() throws RecognitionException { return gIdentifiersParser.columnRefOrderNotInParenthesis(); }\n \n-\tpublic HiveParser_SelectClauseParser.window_range_expression_return window_range_expression() throws RecognitionException { return gSelectClauseParser.window_range_expression(); }\n+\tpublic HiveParser_IdentifiersParser.timeUnitQualifiers_return timeUnitQualifiers() throws RecognitionException { return gIdentifiersParser.timeUnitQualifiers(); }\n \n-\tpublic HiveParser_SelectClauseParser.selectItem_return selectItem() throws RecognitionException { return gSelectClauseParser.selectItem(); }\n+\tpublic HiveParser_ResourcePlanParser.triggerExpressionStandalone_return triggerExpressionStandalone() throws RecognitionException { return gResourcePlanParser.triggerExpressionStandalone(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceOrExpression_return precedenceOrExpression() throws RecognitionException { return gIdentifiersParser.precedenceOrExpression(); }\n+\tpublic HiveParser_IdentifiersParser.isCondition_return isCondition() throws RecognitionException { return gIdentifiersParser.isCondition(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixRename_return alterStatementSuffixRename(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixRename(table); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixConvert_return alterStatementSuffixConvert() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixConvert(); }\n \n-\tpublic HiveParser_AlterClauseParser.partitionLocation_return partitionLocation() throws RecognitionException { return gAlterClauseParser.partitionLocation(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixUnArchive_return alterStatementSuffixUnArchive() throws RecognitionException { 
return gAlterClauseParser.alterStatementSuffixUnArchive(); }\n \n-\tpublic HiveParser_ResourcePlanParser.alterTriggerStatement_return alterTriggerStatement() throws RecognitionException { return gResourcePlanParser.alterTriggerStatement(); }\n+\tpublic HiveParser_FromClauseParser.tableOrColumn_return tableOrColumn() throws RecognitionException { return gFromClauseParser.tableOrColumn(); }\n \n-\tpublic HiveParser_FromClauseParser.searchCondition_return searchCondition() throws RecognitionException { return gFromClauseParser.searchCondition(); }\n+\tpublic HiveParser_CreateDDLParser.dropDataConnectorStatement_return dropDataConnectorStatement() throws RecognitionException { return gCreateDDLParser.dropDataConnectorStatement(); }\n \n-\tpublic HiveParser_FromClauseParser.joinSourcePart_return joinSourcePart() throws RecognitionException { return gFromClauseParser.joinSourcePart(); }\n+\tpublic HiveParser_SelectClauseParser.window_frame_boundary_return window_frame_boundary() throws RecognitionException { return gSelectClauseParser.window_frame_boundary(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionMain_return precedenceSimilarExpressionMain() throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionMain(); }\n+\tpublic HiveParser_ResourcePlanParser.dropPoolStatement_return dropPoolStatement() throws RecognitionException { return gResourcePlanParser.dropPoolStatement(); }\n \n-\tpublic HiveParser_IdentifiersParser.sortByClause_return sortByClause() throws RecognitionException { return gIdentifiersParser.sortByClause(); }\n+\tpublic HiveParser_IdentifiersParser.functionIdentifier_return functionIdentifier() throws RecognitionException { return gIdentifiersParser.functionIdentifier(); }\n \n-\tpublic HiveParser_IdentifiersParser.columnRefOrderInParenthesis_return columnRefOrderInParenthesis() throws RecognitionException { return gIdentifiersParser.columnRefOrderInParenthesis(); }\n+\tpublic 
HiveParser_IdentifiersParser.precedenceSimilarExpressionMain_return precedenceSimilarExpressionMain() throws RecognitionException { return gIdentifiersParser.precedenceSimilarExpressionMain(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceUnaryPrefixExpression_return precedenceUnaryPrefixExpression() throws RecognitionException { return gIdentifiersParser.precedenceUnaryPrefixExpression(); }\n+\tpublic HiveParser_AlterClauseParser.partitionLocation_return partitionLocation() throws RecognitionException { return gAlterClauseParser.partitionLocation(); }\n \n-\tpublic HiveParser_IdentifiersParser.constant_return constant() throws RecognitionException { return gIdentifiersParser.constant(); }\n+\tpublic HiveParser_FromClauseParser.aliasList_return aliasList() throws RecognitionException { return gFromClauseParser.aliasList(); }\n \n-\tpublic HiveParser_IdentifiersParser.castExpression_return castExpression() throws RecognitionException { return gIdentifiersParser.castExpression(); }\n+\tpublic HiveParser_FromClauseParser.uniqueJoinSource_return uniqueJoinSource() throws RecognitionException { return gFromClauseParser.uniqueJoinSource(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixAddCol_return alterStatementSuffixAddCol() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixAddCol(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixDropConstraint_return alterStatementSuffixDropConstraint() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropConstraint(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixMergeFiles_return alterStatementSuffixMergeFiles(boolean partition) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixMergeFiles(partition); }\n+\tpublic HiveParser_IdentifiersParser.timeQualifiers_return timeQualifiers() throws RecognitionException { return gIdentifiersParser.timeQualifiers(); }\n \n-\tpublic 
HiveParser_IdentifiersParser.booleanValueTok_return booleanValueTok() throws RecognitionException { return gIdentifiersParser.booleanValueTok(); }\n+\tpublic HiveParser_IdentifiersParser.precedenceRegexpOperator_return precedenceRegexpOperator() throws RecognitionException { return gIdentifiersParser.precedenceRegexpOperator(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceAmpersandOperator_return precedenceAmpersandOperator() throws RecognitionException { return gIdentifiersParser.precedenceAmpersandOperator(); }\n+\tpublic HiveParser_AlterClauseParser.alterTblPartitionStatementSuffix_return alterTblPartitionStatementSuffix(boolean partition) throws RecognitionException { return gAlterClauseParser.alterTblPartitionStatementSuffix(partition); }\n \n \tpublic HiveParser_AlterClauseParser.alterViewSuffixProperties_return alterViewSuffixProperties() throws RecognitionException { return gAlterClauseParser.alterViewSuffixProperties(); }\n \n-\tpublic HiveParser_AlterClauseParser.alterStatementSuffixDropBranch_return alterStatementSuffixDropBranch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropBranch(); }\n-\n-\tpublic HiveParser_FromClauseParser.valuesSource_return valuesSource() throws RecognitionException { return gFromClauseParser.valuesSource(); }\n+\tpublic HiveParser_IdentifiersParser.clusterByClause_return clusterByClause() throws RecognitionException { return gIdentifiersParser.clusterByClause(); }\n \n-\tpublic HiveParser_IdentifiersParser.parameterIdx_return parameterIdx() throws RecognitionException { return gIdentifiersParser.parameterIdx(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixTouch_return alterStatementSuffixTouch() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixTouch(); }\n \n-\tpublic HiveParser_IdentifiersParser.precedenceSimilarExpressionPartNot_return precedenceSimilarExpressionPartNot(CommonTree t) throws RecognitionException { return 
gIdentifiersParser.precedenceSimilarExpressionPartNot(t); }\n+\tpublic HiveParser_IdentifiersParser.precedenceSimilarOperator_return precedenceSimilarOperator() throws RecognitionException { return gIdentifiersParser.precedenceSimilarOperator(); }\n \n-\tpublic HiveParser_FromClauseParser.uniqueJoinTableSource_return uniqueJoinTableSource() throws RecognitionException { return gFromClauseParser.uniqueJoinTableSource(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixDropPartitions_return alterStatementSuffixDropPartitions(boolean table) throws RecognitionException { return gAlterClauseParser.alterStatementSuffixDropPartitions(table); }\n \n-\tpublic HiveParser_ResourcePlanParser.enable_return enable() throws RecognitionException { return gResourcePlanParser.enable(); }\n+\tpublic HiveParser_ResourcePlanParser.resourcePlanDdlStatements_return resourcePlanDdlStatements() throws RecognitionException { return gResourcePlanParser.resourcePlanDdlStatements(); }\n \n-\tpublic HiveParser_IdentifiersParser.expression_return expression() throws RecognitionException { return gIdentifiersParser.expression(); }\n+\tpublic HiveParser_IdentifiersParser.extractExpression_return extractExpression() throws RecognitionException { return gIdentifiersParser.extractExpression(); }\n \n-\tpublic HiveParser_FromClauseParser.lateralView_return lateralView() throws RecognitionException { return gFromClauseParser.lateralView(); }\n+\tpublic HiveParser_AlterClauseParser.alterStatementSuffixExchangePartition_return alterStatementSuffixExchangePartition() throws RecognitionException { return gAlterClauseParser.alterStatementSuffixExchangePartition(); }\n \n \tpublic final boolean synpred18_HiveParser() {\n \t\tstate.backtracking++;\n \t\tint start = input.mark();\n \t\ttry {\n \t\t\tsynpred18_HiveParser_fragment(); // can never throw exception\n \t\t} catch (RecognitionException re) {\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_AlterClauseParser.java", "source2": 
"org/apache/hadoop/hive/ql/parse/HiveParser_AlterClauseParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 AlterClauseParser.g 2023-08-07 15:45:12\n+// $ANTLR 3.5.2 AlterClauseParser.g 2025-01-31 11:38:45\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -1142,15 +1142,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_tableName.add(tableName3.getTree());\n \t\t\t\t\tpushFollow(FOLLOW_alterTableStatementSuffix_in_alterStatement72);\n \t\t\t\t\talterTableStatementSuffix4=alterTableStatementSuffix();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_alterTableStatementSuffix.add(alterTableStatementSuffix4.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: alterTableStatementSuffix, tableName\n+\t\t\t\t\t// elements: tableName, alterTableStatementSuffix\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -1965,15 +1965,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec34.getTree());\n \t\t\t\t\tpushFollow(FOLLOW_alterTblPartitionStatementSuffix_in_alterTableStatementSuffix312);\n \t\t\t\t\talterTblPartitionStatementSuffix35=alterTblPartitionStatementSuffix(true);\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_alterTblPartitionStatementSuffix.add(alterTblPartitionStatementSuffix35.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: partitionSpec, alterTblPartitionStatementSuffix\n+\t\t\t\t\t// elements: alterTblPartitionStatementSuffix, partitionSpec\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: 
\n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3321,15 +3321,15 @@\n \n \t\t\tpushFollow(FOLLOW_dbProperties_in_alterDatabaseSuffixProperties797);\n \t\t\tdbProperties75=gHiveParser.dbProperties();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_dbProperties.add(dbProperties75.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: dbProperties, name\n+\t\t\t// elements: name, dbProperties\n \t\t\t// token labels: \n \t\t\t// rule labels: name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3623,15 +3623,15 @@\n \t\t\t\t\tKW_LOCATION80=(Token)match(input,KW_LOCATION,FOLLOW_KW_LOCATION_in_alterDatabaseSuffixSetLocation895); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_KW_LOCATION.add(KW_LOCATION80);\n \n \t\t\t\t\tnewLocation=(Token)match(input,StringLiteral,FOLLOW_StringLiteral_in_alterDatabaseSuffixSetLocation899); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_StringLiteral.add(newLocation);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: newLocation, dbName\n+\t\t\t\t\t// elements: dbName, newLocation\n \t\t\t\t\t// token labels: newLocation\n \t\t\t\t\t// rule labels: dbName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -4046,15 +4046,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_restrictOrCascade.add(restrictOrCascade92.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: restrictOrCascade, columnNameTypeList, restrictOrCascade, columnNameTypeList\n+\t\t\t// elements: columnNameTypeList, 
restrictOrCascade, columnNameTypeList, restrictOrCascade\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4255,15 +4255,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_alterConstraintWithName.add(alterConstraintWithName94.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: alterForeignKeyWithName, alterConstraintWithName\n+\t\t\t// elements: alterConstraintWithName, alterForeignKeyWithName\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4685,15 +4685,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_restrictOrCascade.add(restrictOrCascade106.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: newName, oldName, comment, alterStatementChangeColPosition, alterColumnConstraint, colType, restrictOrCascade\n+\t\t\t// elements: newName, alterStatementChangeColPosition, colType, oldName, restrictOrCascade, alterColumnConstraint, comment\n \t\t\t// token labels: comment\n \t\t\t// rule labels: newName, oldName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4873,15 +4873,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: colName, comment, colName, tableProperties, tableProperties, comment\n+\t\t\t// elements: comment, tableProperties, colName, tableProperties, colName, comment\n \t\t\t// token labels: comment\n \t\t\t// rule labels: colName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( 
state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -5277,15 +5277,15 @@\n \t\t\t\t\tEarlyExitException eee = new EarlyExitException(24, input);\n \t\t\t\t\tthrow eee;\n \t\t\t\t}\n \t\t\t\tcnt24++;\n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: alterStatementSuffixAddPartitionsElement, alterStatementSuffixAddPartitionsElement, ifNotExists, ifNotExists\n+\t\t\t// elements: alterStatementSuffixAddPartitionsElement, ifNotExists, alterStatementSuffixAddPartitionsElement, ifNotExists\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -6007,15 +6007,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_replicationClause.add(replicationClause139.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: replicationClause, partitionSelectorSpec, partitionSelectorSpec, replicationClause, ifExists, ifExists, KW_PURGE\n+\t\t\t// elements: KW_PURGE, replicationClause, ifExists, replicationClause, partitionSelectorSpec, partitionSelectorSpec, ifExists\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -6245,15 +6245,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_tableProperties_in_alterStatementSuffixProperties2093);\n \t\t\t\t\ttableProperties146=gHiveParser.tableProperties();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_tableProperties.add(tableProperties146.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: ifExists, tableProperties\n+\t\t\t\t\t// elements: tableProperties, ifExists\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule 
list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -6435,15 +6435,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_tableProperties_in_alterViewSuffixProperties2166);\n \t\t\t\t\ttableProperties153=gHiveParser.tableProperties();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_tableProperties.add(tableProperties153.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: ifExists, tableProperties\n+\t\t\t\t\t// elements: tableProperties, ifExists\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -6617,15 +6617,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_tableProperties.add(tableProperties158.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: tableProperties, tableProperties, serdeName, serdeName\n+\t\t\t\t\t// elements: serdeName, tableProperties, tableProperties, serdeName\n \t\t\t\t\t// token labels: serdeName\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -6858,15 +6858,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec166.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: partitionSpec, tableName\n+\t\t\t// elements: tableName, partitionSpec\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -7579,15 +7579,15 @@\n 
\t\t\tEQUAL185=(Token)match(input,EQUAL,FOLLOW_EQUAL_in_skewedLocationMap2707); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_EQUAL.add(EQUAL185);\n \n \t\t\tvalue=(Token)match(input,StringLiteral,FOLLOW_StringLiteral_in_skewedLocationMap2711); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_StringLiteral.add(value);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: value, key\n+\t\t\t// elements: key, value\n \t\t\t// token labels: value\n \t\t\t// rule labels: key, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -8000,15 +8000,15 @@\n \n \t\t\tpushFollow(FOLLOW_tableName_in_alterStatementSuffixExchangePartition2887);\n \t\t\texchangename=gHiveParser.tableName();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_tableName.add(exchangename.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: exchangename, partitionSpec\n+\t\t\t// elements: partitionSpec, exchangename\n \t\t\t// token labels: \n \t\t\t// rule labels: exchangename, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -8828,15 +8828,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_orderByClause.add(orderByClause221.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableProperties, tableImplBuckets, compactPool, compactType, orderByClause, blocking\n+\t\t\t// elements: blocking, tableImplBuckets, orderByClause, tableProperties, compactType, compactPool\n \t\t\t// token labels: compactType\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -9450,15 
+9450,15 @@\n \n \t\t\t\t\t}\n \n \t\t\t\t\tRPAREN240=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_alterStatementSuffixExecute3513); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN240);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: expireParam, KW_EXPIRE_SNAPSHOTS\n+\t\t\t\t\t// elements: KW_EXPIRE_SNAPSHOTS, expireParam\n \t\t\t\t\t// token labels: expireParam\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -9623,15 +9623,15 @@\n \n \t\t\tpushFollow(FOLLOW_identifier_in_alterStatementSuffixDropBranch3599);\n \t\t\tbranchName=gHiveParser.identifier();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_identifier.add(branchName.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: ifExists, branchName\n+\t\t\t// elements: branchName, ifExists\n \t\t\t// token labels: \n \t\t\t// rule labels: branchName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -9788,15 +9788,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_retentionOfSnapshots.add(retentionOfSnapshots252.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: retentionOfSnapshots, refRetain, branchName, snapshotIdOfRef\n+\t\t\t// elements: snapshotIdOfRef, retentionOfSnapshots, branchName, refRetain\n \t\t\t// token labels: \n \t\t\t// rule labels: branchName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -10391,15 +10391,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_refRetain.add(refRetain269.getTree());\n \t\t\t\t\t}\n 
\t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tagName, refRetain, snapshotIdOfRef\n+\t\t\t// elements: snapshotIdOfRef, tagName, refRetain\n \t\t\t// token labels: \n \t\t\t// rule labels: tagName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -10588,15 +10588,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: outFmt, outDriver, inFmt, serdeCls, inDriver\n+\t\t\t\t\t// elements: inDriver, outDriver, inFmt, outFmt, serdeCls\n \t\t\t\t\t// token labels: inFmt, inDriver, outDriver, serdeCls, outFmt\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -10944,15 +10944,15 @@\n \n \t\t\tpushFollow(FOLLOW_dcProperties_in_alterDataConnectorSuffixProperties4131);\n \t\t\tdcProperties280=gHiveParser.dcProperties();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_dcProperties.add(dcProperties280.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: dcProperties, name\n+\t\t\t// elements: name, dcProperties\n \t\t\t// token labels: \n \t\t\t// rule labels: name, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_CreateDDLParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_CreateDDLParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 CreateDDLParser.g 2023-08-07 15:45:14\n+// $ANTLR 3.5.2 CreateDDLParser.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n 
import java.util.Collection;\n import java.util.HashMap;\n@@ -1127,15 +1127,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(format.getTree());\n \t\t\t\t\turi=(Token)match(input,StringLiteral,FOLLOW_StringLiteral_in_likeTableOrFile84); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_StringLiteral.add(uri);\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: format, uri\n+\t\t\t\t\t// elements: uri, format\n \t\t\t\t\t// token labels: uri\n \t\t\t\t\t// rule labels: format, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -1800,15 +1800,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: createTablePartitionSpec, tableLocation, trans, tableBuckets, tableSkewed, tablePropertiesPrefixed, temp, tableFileFormat, likeTableOrFile, ext, selectStatementWithCTE, columnNameTypeOrConstraintList, name, tableRowFormat, ifNotExists, tableComment\n+\t\t\t\t\t// elements: trans, tableBuckets, tableComment, tableRowFormat, tableLocation, likeTableOrFile, selectStatementWithCTE, createTablePartitionSpec, ifNotExists, name, tableSkewed, ext, tablePropertiesPrefixed, tableFileFormat, temp, columnNameTypeOrConstraintList\n \t\t\t\t\t// token labels: ext, temp, trans\n \t\t\t\t\t// rule labels: name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -2286,15 +2286,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: name, tablePropertiesPrefixed, tableLocation, tableComment, tableBuckets, ifNotExists, tableFileFormat, createTablePartitionSpec, columnNameTypeOrConstraintList, tableSkewed, mgd, 
likeTableOrFile, tableRowFormat, selectStatementWithCTE\n+\t\t\t\t\t// elements: name, tableComment, tableFileFormat, tableBuckets, likeTableOrFile, tableLocation, tableRowFormat, createTablePartitionSpec, tableSkewed, columnNameTypeOrConstraintList, tablePropertiesPrefixed, ifNotExists, mgd, selectStatementWithCTE\n \t\t\t\t\t// token labels: mgd\n \t\t\t\t\t// rule labels: name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -2547,15 +2547,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_dcProperties.add(dcprops.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: dataConnectorType, ifNotExists, dataConnectorUrl, dcprops, dataConnectorComment, name\n+\t\t\t// elements: dataConnectorUrl, dcprops, name, ifNotExists, dataConnectorType, dataConnectorComment\n \t\t\t// token labels: \n \t\t\t// rule labels: name, retval, dcprops\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3038,15 +3038,15 @@\n \n \t\t\tpushFollow(FOLLOW_identifier_in_dropDataConnectorStatement1229);\n \t\t\tidentifier66=gHiveParser.identifier();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_identifier.add(identifier66.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, ifExists\n+\t\t\t// elements: ifExists, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_FromClauseParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_FromClauseParser.java", "unified_diff": "@@ 
-1,8 +1,8 @@\n-// $ANTLR 3.5.2 FromClauseParser.g 2023-08-07 15:45:13\n+// $ANTLR 3.5.2 FromClauseParser.g 2025-01-31 11:38:46\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -3386,15 +3386,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: function, identifier, tableAlias\n+\t\t\t\t\t// elements: identifier, tableAlias, function\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3689,15 +3689,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: tableAlias, identifier, valuesClause\n+\t\t\t\t\t// elements: valuesClause, identifier, tableAlias\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3994,24 +3994,24 @@\n \n \t\t\t}\n \n \t\t\tRPAREN114=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_tableBucketSample1292); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN114);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: numerator, denominator, expr\n-\t\t\t// token labels: numerator, denominator\n+\t\t\t// elements: denominator, numerator, expr\n+\t\t\t// token labels: denominator, numerator\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: expr\n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n-\t\t\tRewriteRuleTokenStream stream_numerator=new RewriteRuleTokenStream(adaptor,\"token 
numerator\",numerator);\n \t\t\tRewriteRuleTokenStream stream_denominator=new RewriteRuleTokenStream(adaptor,\"token denominator\",denominator);\n+\t\t\tRewriteRuleTokenStream stream_numerator=new RewriteRuleTokenStream(adaptor,\"token numerator\",numerator);\n \t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n \t\t\tRewriteRuleSubtreeStream stream_expr=new RewriteRuleSubtreeStream(adaptor,\"token expr\",list_expr);\n \t\t\troot_0 = (ASTNode)adaptor.nil();\n \t\t\t// 186:149: -> ^( TOK_TABLEBUCKETSAMPLE $numerator $denominator ( $expr)* )\n \t\t\t{\n \t\t\t\t// FromClauseParser.g:186:152: ^( TOK_TABLEBUCKETSAMPLE $numerator $denominator ( $expr)* )\n \t\t\t\t{\n@@ -4590,15 +4590,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(alias.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: asOf, tabname, props, ts, alias\n+\t\t\t// elements: props, ts, asOf, tabname, alias\n \t\t\t// token labels: \n \t\t\t// rule labels: tabname, asOf, alias, retval, props, ts\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -5280,15 +5280,15 @@\n \n \t\t\tpushFollow(FOLLOW_identifier_in_viewName1859);\n \t\t\tview=gHiveParser.identifier();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_identifier.add(view.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: view, db\n+\t\t\t// elements: db, view\n \t\t\t// token labels: \n \t\t\t// rule labels: view, db, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -5554,15 +5554,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_orderByClause.add(orderByClause143.getTree());\n 
\t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: partitionByClause, orderByClause\n+\t\t\t\t\t// elements: orderByClause, partitionByClause\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -5656,15 +5656,15 @@\n \t\t\t\t\t\t\tif ( state.backtracking==0 ) stream_sortByClause.add(sortByClause146.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: sortByClause, distributeByClause\n+\t\t\t\t\t// elements: distributeByClause, sortByClause\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -6421,15 +6421,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_identifier.add(alias.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: name, alias, ptfsrc, spec, expression\n+\t\t\t// elements: ptfsrc, spec, expression, name, alias\n \t\t\t// token labels: \n \t\t\t// rule labels: ptfsrc, name, alias, spec, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -7244,15 +7244,15 @@\n \n \t\t\t}\n \n \t\t\tRPAREN189=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_virtualTableSource2677); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN189);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: valuesClause, identifier\n+\t\t\t// elements: identifier, valuesClause\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// 
rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_IdentifiersParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_IdentifiersParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 IdentifiersParser.g 2023-08-07 15:45:14\n+// $ANTLR 3.5.2 IdentifiersParser.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -2725,15 +2725,15 @@\n \t\t\t\tdefault :\n \t\t\t\t\tbreak loop15;\n \t\t\t\t}\n \t\t\t}\n \n \t\t\tif ( state.backtracking==0 ) { incAliasCounter(); }\n \t\t\t// AST REWRITE\n-\t\t\t// elements: expressionWithAlias, expressionWithAlias, identifier\n+\t\t\t// elements: identifier, expressionWithAlias, expressionWithAlias\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -2895,15 +2895,15 @@\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\tif ( state.backtracking==0 ) { incAliasCounter(); }\n \t\t\t// AST REWRITE\n-\t\t\t// elements: expression, expression, identifier\n+\t\t\t// elements: expression, identifier, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3958,15 +3958,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_selectExpression.add(str.getTree());\n \t\t\t}\n \n \t\t\tRPAREN79=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_trimFunction1438); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN79);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: str, str, str, 
trim_characters, trim_characters, trim_characters\n+\t\t\t// elements: trim_characters, str, str, trim_characters, str, trim_characters\n \t\t\t// token labels: \n \t\t\t// rule labels: str, trim_characters, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -4777,15 +4777,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: functionName, functionName, ws, ordBy, functionName, ws, selectExpression, functionName, selectExpression, ws, selectExpression\n+\t\t\t\t\t// elements: ws, functionName, functionName, selectExpression, ws, functionName, selectExpression, selectExpression, functionName, ws, ordBy\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: ordBy, ws, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -5263,15 +5263,15 @@\n \n \t\t\t}\n \n \t\t\tRPAREN107=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_castExpression2132); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN107);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: toType, StringLiteral, expression, expression, expression, StringLiteral\n+\t\t\t// elements: StringLiteral, toType, expression, expression, StringLiteral, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: toType, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -5634,15 +5634,15 @@\n \n \t\t\t}\n \n \t\t\tKW_END124=(Token)match(input,KW_END,FOLLOW_KW_END_in_whenExpression2357); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_KW_END.add(KW_END124);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: KW_WHEN, 
expression\n+\t\t\t// elements: expression, KW_WHEN\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -5770,15 +5770,15 @@\n \n \t\t\t}\n \n \t\t\tRPAREN129=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_floorExpression2444); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN129);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: expression, floorUnit, expression\n+\t\t\t// elements: floorUnit, expression, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: retval, floorUnit\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -6245,15 +6245,15 @@\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_expression.add(expression141.getTree());\n \t\t\tRPAREN142=(Token)match(input,RPAREN,FOLLOW_RPAREN_in_extractExpression2666); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_RPAREN.add(RPAREN142);\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: expression, timeUnit\n+\t\t\t// elements: timeUnit, expression\n \t\t\t// token labels: \n \t\t\t// rule labels: retval, timeUnit\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -7396,15 +7396,15 @@\n \t\t\tcsName=(Token)match(input,CharSetName,FOLLOW_CharSetName_in_charSetStringLiteral3118); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_CharSetName.add(csName);\n \n \t\t\tcsLiteral=(Token)match(input,CharSetLiteral,FOLLOW_CharSetLiteral_in_charSetStringLiteral3122); if (state.failed) return retval; \n \t\t\tif ( state.backtracking==0 ) stream_CharSetLiteral.add(csLiteral);\n \n 
\t\t\t// AST REWRITE\n-\t\t\t// elements: csLiteral, csName\n+\t\t\t// elements: csName, csLiteral\n \t\t\t// token labels: csName, csLiteral\n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -10533,15 +10533,15 @@\n \t\t\t\t\tequalExpr=precedenceBitwiseOrExpression();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_precedenceBitwiseOrExpression.add(equalExpr.getTree());\n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: precedenceSimilarOperator, equalExpr\n+\t\t\t\t\t// elements: equalExpr, precedenceSimilarOperator\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval, equalExpr\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -10769,15 +10769,15 @@\n \t\t\t\t\tmax=precedenceBitwiseOrExpression();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_precedenceBitwiseOrExpression.add(max.getTree());\n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: max, min\n+\t\t\t\t\t// elements: min, max\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: min, max, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -11430,15 +11430,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_precedenceRegexpOperator.add(precedenceRegexpOperator274.getTree());\n \t\t\t\t\tpushFollow(FOLLOW_precedenceBitwiseOrExpression_in_precedenceSimilarExpressionPartNot4900);\n \t\t\t\t\tnotExpr=precedenceBitwiseOrExpression();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n 
\t\t\t\t\tif ( state.backtracking==0 ) stream_precedenceBitwiseOrExpression.add(notExpr.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: precedenceRegexpOperator, notExpr\n+\t\t\t\t\t// elements: notExpr, precedenceRegexpOperator\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: notExpr, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -12384,15 +12384,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_isCondition.add(isCondition300.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: isCondition, precedenceEqualExpression, precedenceEqualExpression\n+\t\t\t// elements: precedenceEqualExpression, precedenceEqualExpression, isCondition\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -13118,15 +13118,15 @@\n \t\t\t\t\tif ( state.backtracking==0 ) stream_partitionSpec.add(partitionSpec317.getTree());\n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: tableName, partitionSpec\n+\t\t\t// elements: partitionSpec, tableName\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -13594,15 +13594,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_partitionSelectorOperator.add(partitionSelectorOperator333.getTree());\n \t\t\tpushFollow(FOLLOW_constant_in_partitionSelectorVal5709);\n \t\t\tconstant334=constant();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_constant.add(constant334.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// 
elements: identifier, constant, partitionSelectorOperator\n+\t\t\t// elements: partitionSelectorOperator, constant, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_PrepareStatementParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_PrepareStatementParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 PrepareStatementParser.g 2023-08-07 15:45:14\n+// $ANTLR 3.5.2 PrepareStatementParser.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -1049,15 +1049,15 @@\n \n \t\t\tpushFollow(FOLLOW_queryStatementExpression_in_prepareStatement72);\n \t\t\tqueryStatementExpression4=gHiveParser.queryStatementExpression();\n \t\t\tstate._fsp--;\n \n \t\t\tstream_queryStatementExpression.add(queryStatementExpression4.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, queryStatementExpression\n+\t\t\t// elements: queryStatementExpression, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleSubtreeStream stream_retval=new RewriteRuleSubtreeStream(adaptor,\"rule retval\",retval!=null?retval.getTree():null);\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_ResourcePlanParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_ResourcePlanParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 ResourcePlanParser.g 2023-08-07 15:45:14\n+// $ANTLR 3.5.2 ResourcePlanParser.g 2025-01-31 11:38:47\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import 
java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -2083,15 +2083,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_identifier_in_createResourcePlanStatement435);\n \t\t\t\t\tlikeName=gHiveParser.identifier();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_identifier.add(likeName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: likeName, name, ifNotExists\n+\t\t\t\t\t// elements: name, likeName, ifNotExists\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: likeName, name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_likeName=new RewriteRuleSubtreeStream(adaptor,\"rule likeName\",likeName!=null?likeName.getTree():null);\n@@ -2163,15 +2163,15 @@\n \t\t\t\t\t\t\tstream_rpAssignList.add(rpAssignList36.getTree());\n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: ifNotExists, name, rpAssignList\n+\t\t\t\t\t// elements: rpAssignList, ifNotExists, name\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_name=new RewriteRuleSubtreeStream(adaptor,\"rule name\",name!=null?name.getTree():null);\n@@ -2935,15 +2935,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_identifier_in_alterResourcePlanStatement733);\n \t\t\t\t\tnewName=gHiveParser.identifier();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_identifier.add(newName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: name, newName\n+\t\t\t\t\t// elements: newName, name\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: newName, name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = 
root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_newName=new RewriteRuleSubtreeStream(adaptor,\"rule newName\",newName!=null?newName.getTree():null);\n@@ -3054,15 +3054,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: name, enable, activate\n+\t\t\t\t\t// elements: activate, name, enable\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: name, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_name=new RewriteRuleSubtreeStream(adaptor,\"rule name\",name!=null?name.getTree():null);\n@@ -4417,15 +4417,15 @@\n \n \t\t\tpushFollow(FOLLOW_triggerActionExpression_in_createTriggerStatement1362);\n \t\t\ttriggerActionExpression106=triggerActionExpression();\n \t\t\tstate._fsp--;\n \n \t\t\tstream_triggerActionExpression.add(triggerActionExpression106.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: triggerActionExpression, rpName, triggerName, triggerExpression\n+\t\t\t// elements: triggerName, triggerActionExpression, rpName, triggerExpression\n \t\t\t// token labels: \n \t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -4685,15 +4685,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_triggerActionExpression_in_alterTriggerStatement1440);\n \t\t\t\t\ttriggerActionExpression113=triggerActionExpression();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_triggerActionExpression.add(triggerActionExpression113.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: rpName, triggerName, triggerExpression, triggerActionExpression\n+\t\t\t\t\t// elements: rpName, triggerExpression, 
triggerActionExpression, triggerName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -4740,15 +4740,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_poolPath_in_alterTriggerStatement1486);\n \t\t\t\t\tpoolName=poolPath();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_poolPath.add(poolName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: rpName, poolName, triggerName\n+\t\t\t\t\t// elements: triggerName, poolName, rpName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval, poolName\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -4795,15 +4795,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_poolPath_in_alterTriggerStatement1521);\n \t\t\t\t\tpoolName=poolPath();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_poolPath.add(poolName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: rpName, triggerName, poolName\n+\t\t\t\t\t// elements: rpName, poolName, triggerName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval, poolName\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -4845,15 +4845,15 @@\n 
\t\t\t\t\tKW_TO121=(Token)match(input,KW_TO,FOLLOW_KW_TO_in_alterTriggerStatement1551);  \n \t\t\t\t\tstream_KW_TO.add(KW_TO121);\n \n \t\t\t\t\tKW_UNMANAGED122=(Token)match(input,KW_UNMANAGED,FOLLOW_KW_UNMANAGED_in_alterTriggerStatement1553);  \n \t\t\t\t\tstream_KW_UNMANAGED.add(KW_UNMANAGED122);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: triggerName, rpName\n+\t\t\t\t\t// elements: rpName, triggerName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -4894,15 +4894,15 @@\n \t\t\t\t\tKW_FROM124=(Token)match(input,KW_FROM,FOLLOW_KW_FROM_in_alterTriggerStatement1581);  \n \t\t\t\t\tstream_KW_FROM.add(KW_FROM124);\n \n \t\t\t\t\tKW_UNMANAGED125=(Token)match(input,KW_UNMANAGED,FOLLOW_KW_UNMANAGED_in_alterTriggerStatement1583);  \n \t\t\t\t\tstream_KW_UNMANAGED.add(KW_UNMANAGED125);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: triggerName, rpName\n+\t\t\t\t\t// elements: rpName, triggerName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -5698,15 +5698,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_poolAssignList_in_alterPoolStatement1958);\n \t\t\t\t\tpoolAssignList151=poolAssignList();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_poolAssignList.add(poolAssignList151.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: rpName, poolPath, poolAssignList\n+\t\t\t\t\t// elements: 
poolAssignList, poolPath, rpName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_rpName=new RewriteRuleSubtreeStream(adaptor,\"rule rpName\",rpName!=null?rpName.getTree():null);\n@@ -5743,15 +5743,15 @@\n \t\t\t\t\tKW_UNSET152=(Token)match(input,KW_UNSET,FOLLOW_KW_UNSET_in_alterPoolStatement1985);  \n \t\t\t\t\tstream_KW_UNSET.add(KW_UNSET152);\n \n \t\t\t\t\tKW_SCHEDULING_POLICY153=(Token)match(input,KW_SCHEDULING_POLICY,FOLLOW_KW_SCHEDULING_POLICY_in_alterPoolStatement1987);  \n \t\t\t\t\tstream_KW_SCHEDULING_POLICY.add(KW_SCHEDULING_POLICY153);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: rpName, poolPath\n+\t\t\t\t\t// elements: poolPath, rpName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_rpName=new RewriteRuleSubtreeStream(adaptor,\"rule rpName\",rpName!=null?rpName.getTree():null);\n@@ -5800,15 +5800,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_identifier_in_alterPoolStatement2024);\n \t\t\t\t\ttriggerName=gHiveParser.identifier();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_identifier.add(triggerName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: poolPath, triggerName, rpName\n+\t\t\t\t\t// elements: poolPath, rpName, triggerName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -5851,15 +5851,15 
@@\n \n \t\t\t\t\tpushFollow(FOLLOW_identifier_in_alterPoolStatement2058);\n \t\t\t\t\ttriggerName=gHiveParser.identifier();\n \t\t\t\t\tstate._fsp--;\n \n \t\t\t\t\tstream_identifier.add(triggerName.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: triggerName, poolPath, rpName\n+\t\t\t\t\t// elements: poolPath, triggerName, rpName\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: triggerName, rpName, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tretval.tree = root_0;\n \t\t\t\t\tRewriteRuleSubtreeStream stream_triggerName=new RewriteRuleSubtreeStream(adaptor,\"rule triggerName\",triggerName!=null?triggerName.getTree():null);\n@@ -5964,15 +5964,15 @@\n \n \t\t\tpushFollow(FOLLOW_poolPath_in_dropPoolStatement2118);\n \t\t\tpoolPath161=poolPath();\n \t\t\tstate._fsp--;\n \n \t\t\tstream_poolPath.add(poolPath161.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: rpName, poolPath\n+\t\t\t// elements: poolPath, rpName\n \t\t\t// token labels: \n \t\t\t// rule labels: rpName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleSubtreeStream stream_rpName=new RewriteRuleSubtreeStream(adaptor,\"rule rpName\",rpName!=null?rpName.getTree():null);\n@@ -6216,15 +6216,15 @@\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: path, order, rpName, name, mappingType, unmanaged\n+\t\t\t// elements: rpName, path, order, mappingType, unmanaged, name\n \t\t\t// token labels: mappingType, name, order\n \t\t\t// rule labels: path, rpName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleTokenStream stream_mappingType=new RewriteRuleTokenStream(adaptor,\"token mappingType\",mappingType);\n@@ -6491,15 +6491,15 @@\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t}\n 
\n \t\t\t// AST REWRITE\n-\t\t\t// elements: order, path, rpName, mappingType, name, unmanaged\n+\t\t\t// elements: rpName, unmanaged, order, path, mappingType, name\n \t\t\t// token labels: mappingType, name, order\n \t\t\t// rule labels: path, rpName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleTokenStream stream_mappingType=new RewriteRuleTokenStream(adaptor,\"token mappingType\",mappingType);\n@@ -6673,15 +6673,15 @@\n \n \t\t\tpushFollow(FOLLOW_identifier_in_dropMappingStatement2469);\n \t\t\trpName=gHiveParser.identifier();\n \t\t\tstate._fsp--;\n \n \t\t\tstream_identifier.add(rpName.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: rpName, name, mappingType\n+\t\t\t// elements: rpName, mappingType, name\n \t\t\t// token labels: mappingType, name\n \t\t\t// rule labels: rpName, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tretval.tree = root_0;\n \t\t\tRewriteRuleTokenStream stream_mappingType=new RewriteRuleTokenStream(adaptor,\"token mappingType\",mappingType);\n"}, {"source1": "org/apache/hadoop/hive/ql/parse/HiveParser_SelectClauseParser.java", "source2": "org/apache/hadoop/hive/ql/parse/HiveParser_SelectClauseParser.java", "unified_diff": "@@ -1,8 +1,8 @@\n-// $ANTLR 3.5.2 SelectClauseParser.g 2023-08-07 15:45:13\n+// $ANTLR 3.5.2 SelectClauseParser.g 2025-01-31 11:38:45\n \n package org.apache.hadoop.hive.ql.parse;\n \n import java.util.Arrays;\n import java.util.ArrayList;\n import java.util.Collection;\n import java.util.HashMap;\n@@ -1168,15 +1168,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: QUERY_HINT, selectList, QUERY_HINT, selectList, selectTrfmClause, QUERY_HINT\n+\t\t\t\t\t// elements: selectList, QUERY_HINT, selectList, selectTrfmClause, QUERY_HINT, QUERY_HINT\n \t\t\t\t\t// token labels: \n 
\t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -1693,15 +1693,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_rowFormat.add(outSerde.getTree());\n \t\t\tpushFollow(FOLLOW_recordReader_in_selectTrfmClause357);\n \t\t\toutRec=gHiveParser.recordReader();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_recordReader.add(outRec.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: outRec, StringLiteral, inRec, columnNameTypeList, outSerde, aliasList, inSerde, selectExpressionList\n+\t\t\t// elements: outSerde, inRec, selectExpressionList, inSerde, outRec, columnNameTypeList, StringLiteral, aliasList\n \t\t\t// token labels: \n \t\t\t// rule labels: inRec, outRec, inSerde, outSerde, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -2313,15 +2313,15 @@\n \t\t\tif ( state.backtracking==0 ) stream_rowFormat.add(outSerde.getTree());\n \t\t\tpushFollow(FOLLOW_recordReader_in_trfmClause622);\n \t\t\toutRec=gHiveParser.recordReader();\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_recordReader.add(outRec.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: StringLiteral, aliasList, inSerde, outRec, columnNameTypeList, selectExpressionList, outSerde, inRec\n+\t\t\t// elements: inRec, columnNameTypeList, selectExpressionList, aliasList, outSerde, inSerde, StringLiteral, outRec\n \t\t\t// token labels: \n \t\t\t// rule labels: inRec, outRec, inSerde, outSerde, retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -2649,15 +2649,15 @@\n \n \t\t\t\tdefault :\n 
\t\t\t\t\tbreak loop21;\n \t\t\t\t}\n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: window_defn, KW_WINDOW\n+\t\t\t// elements: KW_WINDOW, window_defn\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -2749,15 +2749,15 @@\n \n \t\t\tpushFollow(FOLLOW_window_specification_in_window_defn825);\n \t\t\twindow_specification56=window_specification(null);\n \t\t\tstate._fsp--;\n \t\t\tif (state.failed) return retval;\n \t\t\tif ( state.backtracking==0 ) stream_window_specification.add(window_specification56.getTree());\n \t\t\t// AST REWRITE\n-\t\t\t// elements: window_specification, identifier\n+\t\t\t// elements: identifier, window_specification\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3202,15 +3202,15 @@\n \n \t\t\t\t\t}\n \t\t\t\t\tbreak;\n \n \t\t\t}\n \n \t\t\t// AST REWRITE\n-\t\t\t// elements: identifier, window_frame, partitioningSpec\n+\t\t\t// elements: partitioningSpec, window_frame, identifier\n \t\t\t// token labels: \n \t\t\t// rule labels: retval\n \t\t\t// token list labels: \n \t\t\t// rule list labels: \n \t\t\t// wildcard labels: \n \t\t\tif ( state.backtracking==0 ) {\n \t\t\tretval.tree = root_0;\n@@ -3493,15 +3493,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_window_frame_boundary_in_window_range_expression965);\n \t\t\t\t\tend=window_frame_boundary();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_window_frame_boundary.add(end.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: end, s\n+\t\t\t\t\t// elements: s, end\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: s, end, retval\n \t\t\t\t\t// token list labels: 
\n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3683,15 +3683,15 @@\n \n \t\t\t\t\tpushFollow(FOLLOW_window_frame_boundary_in_window_value_expression1029);\n \t\t\t\t\tend=window_frame_boundary();\n \t\t\t\t\tstate._fsp--;\n \t\t\t\t\tif (state.failed) return retval;\n \t\t\t\t\tif ( state.backtracking==0 ) stream_window_frame_boundary.add(end.getTree());\n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: s, end\n+\t\t\t\t\t// elements: end, s\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: s, end, retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3809,15 +3809,15 @@\n \t\t\t\t\tKW_UNBOUNDED73=(Token)match(input,KW_UNBOUNDED,FOLLOW_KW_UNBOUNDED_in_window_frame_start_boundary1064); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_KW_UNBOUNDED.add(KW_UNBOUNDED73);\n \n \t\t\t\t\tKW_PRECEDING74=(Token)match(input,KW_PRECEDING,FOLLOW_KW_PRECEDING_in_window_frame_start_boundary1066); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_KW_PRECEDING.add(KW_PRECEDING74);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: KW_PRECEDING, KW_UNBOUNDED\n+\t\t\t\t\t// elements: KW_UNBOUNDED, KW_PRECEDING\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -3886,15 +3886,15 @@\n \t\t\t\t\tNumber77=(Token)match(input,Number,FOLLOW_Number_in_window_frame_start_boundary1097); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_Number.add(Number77);\n \n 
\t\t\t\t\tKW_PRECEDING78=(Token)match(input,KW_PRECEDING,FOLLOW_KW_PRECEDING_in_window_frame_start_boundary1099); if (state.failed) return retval; \n \t\t\t\t\tif ( state.backtracking==0 ) stream_KW_PRECEDING.add(KW_PRECEDING78);\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: KW_PRECEDING, Number\n+\t\t\t\t\t// elements: Number, KW_PRECEDING\n \t\t\t\t\t// token labels: \n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n@@ -4156,15 +4156,15 @@\n \n \t\t\t\t\t\t\t}\n \t\t\t\t\t\t\tbreak;\n \n \t\t\t\t\t}\n \n \t\t\t\t\t// AST REWRITE\n-\t\t\t\t\t// elements: d, Number\n+\t\t\t\t\t// elements: Number, d\n \t\t\t\t\t// token labels: d\n \t\t\t\t\t// rule labels: retval\n \t\t\t\t\t// token list labels: \n \t\t\t\t\t// rule list labels: \n \t\t\t\t\t// wildcard labels: \n \t\t\t\t\tif ( state.backtracking==0 ) {\n \t\t\t\t\tretval.tree = root_0;\n"}]}
