Iter

Bases: PyoIterator[T]

A superset of Python's built-in Iterator Protocol, providing a rich set of functional programming tools.

Implements the Iterator Protocol from collections.abc, so it can be used as a standard iterator.

It also provides a __bool__() method to check for emptiness without consuming elements.
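The emptiness check works by peeking at most one element and reattaching it in front of the remaining stream. The same idea can be sketched in plain Python with `itertools` (a stdlib sketch of the pattern, not pyochain's exact code):

```python
import itertools

def peek_nonempty(it):
    """Report whether `it` has at least one element without losing it:
    pull at most one item, then chain it back in front of the rest."""
    first = tuple(itertools.islice(it, 1))
    return len(first) > 0, itertools.chain(first, it)

non_empty, it = peek_nonempty(iter([1, 2, 3]))
assert non_empty
assert list(it) == [1, 2, 3]  # nothing was lost by the check
```

Note that the caller must keep using the returned iterator, since the original has had one element pulled from it.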

  • An Iterable is any object capable of returning its members one at a time, permitting it to be iterated over in a for-loop.
  • An Iterator is an object representing a stream of data; returned by calling iter() on an Iterable.
  • Once an Iterator is exhausted, it cannot be reused or reset.
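These three points can be demonstrated with the built-ins alone:

```python
nums = [1, 2, 3]          # an Iterable: can produce fresh iterators at will
it = iter(nums)           # an Iterator: a one-shot stream over nums

first_pass = list(it)     # consumes the stream
second_pass = list(it)    # exhausted: nothing left
fresh_pass = list(iter(nums))  # iter() on the Iterable gives a new stream

assert first_pass == [1, 2, 3]
assert second_pass == []
assert fresh_pass == [1, 2, 3]
```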

It's designed around lazy evaluation, allowing for efficient processing of large datasets.
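Laziness means elements are produced only on demand, so even an infinite source is fine as long as only a finite prefix is ever requested. In stdlib terms:

```python
import itertools

# An infinite stream of even numbers -- nothing is computed yet.
evens = (x for x in itertools.count() if x % 2 == 0)

# Only the requested prefix is ever produced.
first_four = list(itertools.islice(evens, 4))
assert first_four == [0, 2, 4, 6]
```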

Instantiating it from any Iterable (like lists, sets, or generators) is free and efficient: it only calls the built-in iter() on the input.

Once an Iter is created, it can be transformed and manipulated using a variety of chainable methods.

However, keep in mind that Iter instances are single-use; once exhausted, they cannot be reused or reset.

If you need to reuse the data, consider collecting it into a collection first with .collect(), or cloning it with .cloned() to create an independent copy.
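The cloning behaviour rests on the same idea as `itertools.tee`: values are buffered until both copies have consumed them, then freed. A stdlib sketch of that idea (pyochain's `.cloned()`/`.from_ref()` are the library's own API on top of it):

```python
import itertools

source = iter([1, 2, 3])
original, clone = itertools.tee(source)
# After tee(), only use the two returned iterators, never `source` itself.

clone_items = list(clone)        # the clone can be drained first...
original_items = list(original)  # ...the original still yields everything

assert clone_items == [1, 2, 3]
assert original_items == [1, 2, 3]
```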

You can always convert back to an Iter using .iter() for free on any pyochain collection type.

In general, avoid intermediate references when dealing with lazy iterators, and prioritize method chaining instead.
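The reason to avoid intermediate references is that a lazy pipeline and the name it was built from share one underlying stream, so consuming either silently drains the other. In plain Python:

```python
it = iter([1, 2, 3])
doubled = map(lambda x: x * 2, it)  # lazy: shares the same stream as `it`

doubled_items = list(doubled)  # consuming the pipeline...
leftover = list(it)            # ...also exhausted the original reference

assert doubled_items == [2, 4, 6]
assert leftover == []
```

Method chaining avoids the problem by never keeping a second name for the same stream.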

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `data` | `Iterable[T]` | Any object that can be iterated over. | *required* |
Source code in src/pyochain/_iter.py
class Iter[T](PyoIterator[T]):
    """A superset around Python's built-in `Iterator` Protocol, providing a rich set of functional programming tools.

    Implements the `Iterator` Protocol from `collections.abc`, so it can be used as a standard iterator.

    It also provides a `__bool__()` method to check for emptiness without consuming elements.

    - An `Iterable` is any object capable of returning its members one at a time, permitting it to be iterated over in a for-loop.
    - An `Iterator` is an object representing a stream of data; returned by calling `iter()` on an `Iterable`.
    - Once an `Iterator` is exhausted, it cannot be reused or reset.

    It's designed around lazy evaluation, allowing for efficient processing of large datasets.

    Instantiating it from any `Iterable` (like lists, sets, or generators) is free and efficient: it only calls the built-in `iter()` on the input.

    Once an `Iter` is created, it can be transformed and manipulated using a variety of chainable methods.

    However, keep in mind that `Iter` instances are single-use; once exhausted, they cannot be reused or reset.

    If you need to reuse the data, consider collecting it into a collection first with `.collect()`, or cloning it with `.cloned()` to create an independent copy.

    You can always convert back to an `Iter` using `.iter()` for free on any pyochain collection type.

    In general, avoid intermediate references when dealing with lazy iterators, and prioritize method chaining instead.

    Args:
        data (Iterable[T]): Any object that can be iterated over.
    """

    _inner: Iterator[T]
    __slots__ = ("_inner",)

    def __init__(self, data: Iterable[T]) -> None:
        self._inner = iter(data)

    def __iter__(self) -> Iterator[T]:
        return self._inner

    def __next__(self) -> T:
        return next(self._inner)

    def __bool__(self) -> bool:
        """Check if the `Iterator` has at least one element (mutates **self**).

        After calling this, the `Iterator` still contains all elements.

        Returns:
            bool: True if the `Iterator` has at least one element, False otherwise.

        Examples:
        ```python
        >>> import pyochain as pc
        >>> it = pc.Iter([1, 2, 3])
        >>> bool(it)
        True
        >>> it.collect()  # All elements still available
        Seq(1, 2, 3)

        ```
        """
        first = tuple(itertools.islice(self._inner, 1))
        self._inner = itertools.chain(first, self._inner)
        return len(first) > 0

    def __repr__(self) -> str:
        return f"{self.__class__.__name__}({self._inner.__repr__()})"

    @classmethod
    def from_ref(cls, other: Self) -> Self:
        """Create an independent lazy copy from another `Iter`.

        Both the original and the returned `Iter` can be consumed independently, in a lazy manner.

        Note:
            Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

            This is the unavoidable cost of having two independent iterators over the same source.

            However, once both iterators have passed a value, it's freed from memory.

        See Also:
            - `Iter.cloned()` which is the instance method version of this function.

        Args:
            other (Self): An `Iter` instance to copy.

        Returns:
            Self: A new `Iter` instance that is independent from the original.

        Example:
        ```python
        >>> import pyochain as pc
        >>> original = pc.Iter([1, 2, 3])
        >>> copy = pc.Iter.from_ref(original)
        >>> copy.map(lambda x: x * 2).collect()
        Seq(2, 4, 6)
        >>> original.next()
        Some(1)

        ```
        """
        it1, it2 = itertools.tee(other._inner)
        other._inner = it1
        return cls(it2)

    @staticmethod
    def once[V](value: V) -> Iter[V]:
        """Create an `Iter` that yields a single value.

        If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like `Iter([value])`.

        This can be considered the equivalent of `.insert()` but as a constructor.

        Args:
            value (V): The single value to yield.

        Returns:
            Iter[V]: An iterator yielding the specified value.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter.once(42).collect()
        Seq(42,)

        ```
        """
        return Iter((value,))

    @staticmethod
    def once_with[**P, R](
        func: Callable[P, R], *args: P.args, **kwargs: P.kwargs
    ) -> Iter[R]:
        """Create an `Iter`  that lazily generates a value exactly once by invoking the provided closure.

        If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like `Iter([value])`.

        This can be considered the equivalent of `.insert()` but as a constructor.

        Unlike `.once()`, this function will lazily generate the value on request.

        Args:
            func (Callable[P, R]): The function to invoke to produce the value.
            *args (P.args): Positional arguments to pass to **func**.
            **kwargs (P.kwargs): Keyword arguments to pass to **func**.

        Returns:
            Iter[R]: An iterator yielding the specified value.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter.once_with(lambda: 42).collect()
        Seq(42,)

        ```
        """

        def _once_with() -> Generator[R]:
            yield func(*args, **kwargs)

        return Iter(_once_with())

    @staticmethod
    def from_count(start: int = 0, step: int = 1) -> Iter[int]:
        """Create an infinite `Iterator` of evenly spaced values.

        Warning:
            This creates an infinite iterator.

            Be sure to use `Iter.take()` or `Iter.slice()` to limit the number of items taken.

        Args:
            start (int): Starting value of the sequence.
            step (int): Difference between consecutive values.

        Returns:
            Iter[int]: An iterator generating the sequence.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter.from_count(10, 2).take(3).collect()
        Seq(10, 12, 14)

        ```
        """
        return Iter(itertools.count(start, step))

    @staticmethod
    def from_fn[R](f: Callable[[], Option[R]]) -> Iter[R]:
        """Create an `Iter` from a nullary generator function.

        The callable must return:

        - `Some(value)` to yield a value
        - `NONE` to stop


        Args:
            f (Callable[[], Option[R]]): Callable that returns the next item wrapped in `Option`.

        Returns:
            Iter[R]: An iterator yielding values produced by **f**.

        Example:
        ```python
        >>> import pyochain as pc
        >>> counter = 0
        >>> def gen() -> pc.Option[int]:
        ...     global counter
        ...     counter += 1
        ...     return pc.Some(counter) if counter < 6 else pc.NONE
        >>> pc.Iter.from_fn(gen).collect()
        Seq(1, 2, 3, 4, 5)

        ```
        """

        def _from_fn() -> Iterator[R]:
            while True:
                item = f()
                if item.is_none():
                    return
                yield item.unwrap()

        return Iter(_from_fn())

    @staticmethod
    def successors[U](first: Option[U], succ: Callable[[U], Option[U]]) -> Iter[U]:
        """Create an iterator of successive values computed from the previous one.

        The iterator yields `first` (if it is `Some`), then repeatedly applies **succ** to the
        previous yielded value until it returns `NONE`.

        Args:
            first (Option[U]): Initial item.
            succ (Callable[[U], Option[U]]): Successor function.

        Returns:
            Iter[U]: Iterator yielding `first` and its successors.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def next_pow10(x: int) -> pc.Option[int]:
        ...     return pc.Some(x * 10) if x < 10_000 else pc.NONE
        >>> pc.Iter.successors(pc.Some(1), next_pow10).collect()
        Seq(1, 10, 100, 1000, 10000)

        ```
        """

        def _successors() -> Iterator[U]:
            current = first
            while current.is_some():
                value = current.unwrap()
                yield value
                current = succ(value)

        return Iter(_successors())

    def collect[R: Collection[Any]](
        self, collector: Callable[[Iterator[T]], R] = Seq[T]
    ) -> R:
        """Transforms an `Iter` into a collection.

        The most basic pattern in which collect() is used is to turn one collection into another.

        You take a collection, call `iter()` on it, do a bunch of transformations, and then `collect()` at the end.

        You can specify the target collection type by providing a **collector** function or type.

        This can be any `Callable` that takes an `Iterator[T]` and returns a `Collection[T]` of those types.

        Note:
            This can be thought of as `.into()` with a default value (`Seq[T]`) and a different constraint (`Collection[Any]`).
            However, the runtime behavior is identical in both cases: pass **self** to the provided function and return the result.

        Args:
            collector (Callable[[Iterator[T]], R]): Function|type that defines the target collection. `R` is constrained to a `Collection`.

        Returns:
            R: A materialized collection containing the collected elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter(range(5)).collect()
        Seq(0, 1, 2, 3, 4)
        >>> iterator = pc.Iter((1, 2, 3))
        >>> iterator._inner.__class__.__name__
        'tuple_iterator'
        >>> mapped = iterator.map(lambda x: x * 2)
        >>> mapped._inner.__class__.__name__
        'map'
        >>> mapped.collect()
        Seq(2, 4, 6)
        >>> # iterator is now exhausted
        >>> iterator.collect()
        Seq()
        >>> pc.Iter(range(5)).collect(list)
        [0, 1, 2, 3, 4]
        >>> pc.Iter(range(5)).collect(pc.Vec)
        Vec(0, 1, 2, 3, 4)
        >>> iterator = pc.Iter([1, 2, 3])
        >>> iterator._inner.__class__.__name__
        'list_iterator'

        ```
        """
        return collector(self._inner)

    @overload
    def collect_into(self, collection: Vec[T]) -> Vec[T]: ...
    @overload
    def collect_into(self, collection: list[T]) -> list[T]: ...
    def collect_into(self, collection: MutableSequence[T]) -> MutableSequence[T]:
        """Collects all the items from the `Iterator` into a `MutableSequence`.

        This method consumes the `Iterator` and adds all its items to the passed `MutableSequence`.

        The `MutableSequence` is then returned, so the call chain can be continued.

        This is useful when you already have a `MutableSequence` and want to add the `Iterator` items to it.

        This method is a convenience wrapper around `MutableSequence.extend()`, but it is called on an `Iterator` rather than on a `MutableSequence`.

        Args:
            collection (MutableSequence[T]): A mutable collection to collect items into.

        Returns:
            MutableSequence[T]: The same mutable collection passed as argument, now containing the collected items.

        Example:
        Basic usage:
        ```python
        >>> import pyochain as pc
        >>> a = pc.Seq([1, 2, 3])
        >>> vec = pc.Vec([0, 1])
        >>> a.iter().map(lambda x: x * 2).collect_into(vec)
        Vec(0, 1, 2, 4, 6)
        >>> a.iter().map(lambda x: x * 10).collect_into(vec)
        Vec(0, 1, 2, 4, 6, 10, 20, 30)

        ```
        The returned mutable sequence can be used to continue the call chain:
        ```python
        >>> import pyochain as pc
        >>> a = pc.Seq([1, 2, 3])
        >>> vec = pc.Vec[int].new()
        >>> a.iter().collect_into(vec).length() == vec.length()
        True
        >>> a.iter().collect_into(vec).length() == vec.length()
        True

        ```
        """
        collection.extend(self._inner)
        return collection

    def try_collect[U](self: Iter[Option[U]] | Iter[Result[U, Any]]) -> Option[Vec[U]]:
        """Fallibly transforms **self** into a `Vec`, short circuiting if a failure is encountered.

        `try_collect()` is a variation of `collect()` that allows fallible conversions during collection.

        Its main use case is simplifying conversions from iterators yielding `Option[T]` or `Result[T, E]` into `Option[Vec[T]]`.

        Also, if a failure is encountered during `try_collect()`, the `Iter` is still valid and may continue to be used, in which case it will continue iterating starting after the element that triggered the failure.

        See the last example below for an example of how this works.

        Note:
            This method returns `Vec[U]` instead of being customizable, because the underlying data structure must be mutable in order to build up the collection.

        Returns:
            Option[Vec[U]]: `Some[Vec[U]]` if all elements were successfully collected, or `NONE` if a failure was encountered.

        Example:
        ```python
        >>> import pyochain as pc
        >>> # Successfully collecting an iterator of Option[int] into Option[Vec[int]]:
        >>> pc.Iter([pc.Some(1), pc.Some(2), pc.Some(3)]).try_collect()
        Some(Vec(1, 2, 3))
        >>> # Failing to collect in the same way:
        >>> pc.Iter([pc.Some(1), pc.Some(2), pc.NONE, pc.Some(3)]).try_collect()
        NONE
        >>> # A similar example, but with Result:
        >>> pc.Iter([pc.Ok(1), pc.Ok(2), pc.Ok(3)]).try_collect()
        Some(Vec(1, 2, 3))
        >>> pc.Iter([pc.Ok(1), pc.Err("error"), pc.Ok(3)]).try_collect()
        NONE
        >>> def external_fn(x: int) -> pc.Option[int]:
        ...     if x % 2 == 0:
        ...         return pc.Some(x)
        ...     return pc.NONE
        >>> pc.Iter([1, 2, 3, 4]).map(external_fn).try_collect()
        NONE
        >>> # Demonstrating that the iterator remains usable after a failure:
        >>> it = pc.Iter([pc.Some(1), pc.NONE, pc.Some(3), pc.Some(4)])
        >>> it.try_collect()
        NONE
        >>> it.try_collect()
        Some(Vec(3, 4))

        ```
        """
        collected: list[U] = []
        collected_add = collected.append
        for item in self._inner:
            match item:
                case Ok(val) | Some(val):
                    collected_add(val)
                case _:
                    return NONE
        return Some(Vec.from_ref(collected))

    def array_chunks(self, size: int) -> Iter[Self]:
        """Yield subiterators (chunks) that each yield a fixed number elements, determined by size.

        The last chunk will be shorter if there are not enough elements.

        Args:
            size (int): Number of elements in each chunk.

        Returns:
            Iter[Self]: An iterable of iterators, each yielding n elements.

        If the chunks are read in order, the elements of the source iterator
        won't be stored in memory.

        If they are read out of order, elements are cached internally as necessary.

        Example:
        ```python
        >>> import pyochain as pc
        >>> all_chunks = pc.Iter.from_count().array_chunks(4)
        >>> c_1, c_2, c_3 = all_chunks.next(), all_chunks.next(), all_chunks.next()
        >>> c_2.unwrap().collect()  # c_1's elements have been cached; c_3's haven't been
        Seq(4, 5, 6, 7)
        >>> c_1.unwrap().collect()
        Seq(0, 1, 2, 3)
        >>> c_3.unwrap().collect()
        Seq(8, 9, 10, 11)
        >>> pc.Seq([1, 2, 3, 4, 5, 6]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
        Seq(Seq(1, 2, 3), Seq(4, 5, 6))
        >>> pc.Seq([1, 2, 3, 4, 5, 6, 7, 8]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
        Seq(Seq(1, 2, 3), Seq(4, 5, 6), Seq(7, 8))

        ```
        """
        from collections import deque
        from contextlib import suppress

        def _chunks() -> Iterator[Self]:
            def _ichunk(
                iterator: Iterator[T], n: int
            ) -> tuple[Iterator[T], Callable[[int], int]]:
                cache: deque[T] = deque()
                chunk = itertools.islice(iterator, n)

                def _generator() -> Iterator[T]:
                    with suppress(StopIteration):
                        while True:
                            if cache:
                                yield cache.popleft()
                            else:
                                yield next(chunk)

                def _materialize_next(n: int) -> int:
                    to_cache = n - len(cache)

                    # materialize up to n
                    if to_cache > 0:
                        cache.extend(itertools.islice(chunk, to_cache))

                    # return number materialized up to n
                    return min(n, len(cache))

                return (_generator(), _materialize_next)

            new = self.__class__
            while True:
                # Create new chunk
                chunk, _materialize_next = _ichunk(self._inner, size)

                # Check to see whether we're at the end of the source iterable
                if not _materialize_next(size):
                    return

                yield new(chunk)
                _materialize_next(size)

        return Iter(_chunks())

    @overload
    def flatten[U](self: Iter[KeysView[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Iterable[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Generator[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[ValuesView[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Iterator[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Collection[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Sequence[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[list[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[tuple[U, ...]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Iter[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Seq[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Set[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[SetMut[U]]) -> Iter[U]: ...
    @overload
    def flatten[U](self: Iter[Vec[U]]) -> Iter[U]: ...
    @overload
    def flatten(self: Iter[range]) -> Iter[int]: ...
    def flatten[U: Iterable[Any]](self: Iter[U]) -> Iter[Any]:
        """Creates an `Iter` that flattens nested structure.

        Returns:
            Iter[Any]: An `Iter` of flattened elements.

        This is useful when you have an `Iter` of `Iterable` and you want to remove one level of indirection.

        Examples:
        Basic usage:
        ```python
        >>> import pyochain as pc
        >>> data = [[1, 2, 3, 4], [5, 6]]
        >>> flattened = pc.Iter(data).flatten().collect()
        >>> flattened
        Seq(1, 2, 3, 4, 5, 6)

        ```
        Mapping and then flattening:
        ```python
        >>> import pyochain as pc
        >>> words = pc.Iter(["alpha", "beta", "gamma"])
        >>> merged = words.flatten().collect()
        >>> merged
        Seq('a', 'l', 'p', 'h', 'a', 'b', 'e', 't', 'a', 'g', 'a', 'm', 'm', 'a')

        ```
        Flattening only removes one level of nesting at a time:
        ```python
        >>> import pyochain as pc
        >>> d3 = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
        >>> d2 = pc.Iter(d3).flatten().collect()
        >>> d2
        Seq([1, 2], [3, 4], [5, 6], [7, 8])
        >>> d1 = pc.Iter(d3).flatten().flatten().collect()
        >>> d1
        Seq(1, 2, 3, 4, 5, 6, 7, 8)

        ```
        Here we see that `flatten()` does not perform a “deep” flatten.

        Instead, only **one** level of nesting is removed.

        That is, if you `flatten()` a three-dimensional array, the result will be two-dimensional and not one-dimensional.

        To get a one-dimensional structure, you have to `flatten()` again.

        """
        return Iter(itertools.chain.from_iterable(self._inner))

    def flat_map[R](self, func: Callable[[T], Iterable[R]]) -> Iter[R]:
        """Creates an iterator that applies a function to each element of the original iterator and flattens the result.

        This is useful when the **func** you want to pass to `.map()` itself returns an iterable, and you want to avoid having nested iterables in the output.

        This is equivalent to calling `.map(func).flatten()`.
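        In plain stdlib terms, this corresponds to mapping and then flattening one level with `itertools.chain.from_iterable`:
        ```python
        import itertools

        # map each element to an iterable, then flatten one level
        data = [1, 2, 3]
        flat = list(itertools.chain.from_iterable(map(range, data)))
        # flat == [0, 0, 1, 0, 1, 2]
        ```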

        Args:
            func (Callable[[T], Iterable[R]]): Function to apply to each element.

        Returns:
            Iter[R]: An iterable of flattened transformed elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).flat_map(lambda x: range(x)).collect()
        Seq(0, 0, 1, 0, 1, 2)

        ```
        """
        return Iter(itertools.chain.from_iterable(map(func, self._inner)))

    def unique_to_each[U: Iterable[Any]](self: Iter[U]) -> Iter[Iter[U]]:
        """Return the elements from each of the iterators that aren't in the other iterators.

        It is assumed that the elements of each iterable are hashable.

        **Credits**

            more_itertools.unique_to_each

        Returns:
            Iter[Iter[U]]: An iterator of iterators, each containing the unique elements from the corresponding input iterable.

        For example, suppose you have a set of packages, each with a set of dependencies:

        **{'pkg_1': {'A', 'B'}, 'pkg_2': {'B', 'C'}, 'pkg_3': {'B', 'D'}}**

        If you remove one package, which dependencies can also be removed?

        If pkg_1 is removed, then A is no longer necessary - it is not associated with pkg_2 or pkg_3.

        Similarly, C is only needed for pkg_2, and D is only needed for pkg_3:

        ```python
        >>> import pyochain as pc
        >>> data = ({"A", "B"}, {"B", "C"}, {"B", "D"})
        >>> pc.Iter(data).unique_to_each().map(lambda x: x.into(list)).collect()
        Seq(['A'], ['C'], ['D'])

        ```

        If there are duplicates in one input iterable that aren't in the others they will be duplicated in the output.

        Input order is preserved:
        ```python
        >>> data = ("mississippi", "missouri")
        >>> pc.Seq(data).iter().unique_to_each().map(lambda x: x.into(list)).collect()
        Seq(['p', 'p'], ['o', 'u', 'r'])

        ```
        """
        from collections import Counter

        pool: tuple[Iterable[U], ...] = tuple(self._inner)
        counts: Counter[U] = Counter(itertools.chain.from_iterable(map(set, pool)))
        uniques: set[U] = {element for element in counts if counts[element] == 1}

        return Iter((Iter(filter(uniques.__contains__, it))) for it in pool)

    def split_into(self, *sizes: Option[int]) -> Iter[Self]:
        """Yield a list of sequential items from iterable of length 'n' for each integer 'n' in sizes.

        Args:
            *sizes (Option[int]): `Some` integers specifying the sizes of each chunk. Use `NONE` for the remainder.

        Returns:
            Iter[Self]: An iterator of iterators, each containing a chunk of the original iterable.

        If the sum of sizes is smaller than the length of iterable, then the remaining items of iterable will not be returned.

        If the sum of sizes is larger than the length of iterable:

        - fewer items will be returned in the iteration that overruns the iterable
        - further lists will be empty

        When `NONE` is encountered in *sizes*, the returned iterator will contain items up to the end of the iterable, the same way that `itertools.islice` does.

        split_into can be useful for grouping a series of items where the sizes of the groups are not uniform.

        An example would be where in a row from a table:

        - multiple columns represent elements of the same feature (e.g. a point represented by x,y,z)
        - the format is not the same for all columns.
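
        A minimal stdlib sketch of the same slicing logic (a hypothetical `split_into` helper using `None` where pyochain uses `NONE`):
        ```python
        import itertools

        def split_into(iterable, sizes):
            it = iter(iterable)
            for size in sizes:
                if size is None:
                    # None takes everything that remains
                    yield list(it)
                    return
                yield list(itertools.islice(it, size))

        result = list(split_into([1, 2, 3, 4, 5, 6], [1, 2, None]))
        # result == [[1], [2, 3], [4, 5, 6]]
        ```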

        Example:
        ```python
        >>> import pyochain as pc
        >>> def _get_results(x: pc.Iter[pc.Iter[int]]) -> pc.Seq[pc.Seq[int]]:
        ...    return x.map(lambda x: x.collect()).collect()
        >>>
        >>> data = [1, 2, 3, 4, 5, 6]
        >>> pc.Iter(data).split_into(pc.Some(1), pc.Some(2), pc.Some(3)).into(_get_results)
        Seq(Seq(1,), Seq(2, 3), Seq(4, 5, 6))
        >>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3)).into(_get_results)
        Seq(Seq(1, 2), Seq(3, 4, 5))
        >>> pc.Iter([1, 2, 3, 4]).split_into(pc.Some(1), pc.Some(2), pc.Some(3), pc.Some(4)).into(_get_results)
        Seq(Seq(1,), Seq(2, 3), Seq(4,), Seq())
        >>> data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0]
        >>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3), pc.NONE).into(_get_results)
        Seq(Seq(1, 2), Seq(3, 4, 5), Seq(6, 7, 8, 9, 0))

        ```
        """

        def _split_into(data: Iterator[T]) -> Iterator[Self]:
            """Credits: more_itertools.split_into."""
            new = self.__class__
            for size in sizes:
                if size.is_none():
                    yield new(data)
                    return
                else:
                    yield new(itertools.islice(data, size.unwrap()))

        return Iter(_split_into(self._inner))

    def split_when(
        self,
        predicate: Callable[[T, T], bool],
        max_split: int = -1,
    ) -> Iter[Self]:
        """Split iterable into pieces based on the output of a predicate function.

        By default, no limit is placed on the number of splits.

        Args:
            predicate (Callable[[T, T], bool]): Function that takes successive pairs of items and returns True if the iterable should be split.
            max_split (int): Maximum number of splits to perform.

        Returns:
            Iter[Self]: An iterator of iterators of items.

        At most *max_split* splits are done.

        If *max_split* is not specified or -1, then there is no limit on the number of splits.

        The example below shows how to find runs of increasing numbers, by splitting the iterable when element i is larger than element i + 1.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = pc.Seq([1, 2, 3, 3, 2, 5, 2, 4, 2])
        >>> data.iter().split_when(lambda x, y: x > y).map(lambda x: x.collect()).collect()
        Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4), Seq(2,))
        >>> data.iter().split_when(lambda x, y: x > y, max_split=2).map(lambda x: x.collect()).collect()
        Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4, 2))

        ```
        """

        def _split_when(data: Iterator[T], max_split: int) -> Iterator[Self]:
            """Credits: more_itertools.split_when."""
            new = self.__class__
            if max_split == 0:
                yield self
                return
            try:
                cur_item = next(data)
            except StopIteration:
                return

            buf = [cur_item]
            for next_item in data:
                if predicate(cur_item, next_item):
                    yield new(buf)
                    if max_split == 1:
                        yield new((next_item, *data))
                        return
                    buf = []
                    max_split -= 1

                buf.append(next_item)
                cur_item = next_item

            yield new(buf)

        return Iter(_split_when(self._inner, max_split))

    def split_at(
        self,
        predicate: Callable[[T], bool],
        max_split: int = -1,
        *,
        keep_separator: bool = False,
    ) -> Iter[Self]:
        """Yield iterators of items from iterable, where each iterator is delimited by an item where `predicate` returns True.

        By default, no limit is placed on the number of splits.

        Args:
            predicate (Callable[[T], bool]): Function to determine the split points.
            max_split (int): Maximum number of splits to perform.
            keep_separator (bool): Whether to include the separator in the output.

        Returns:
            Iter[Self]: An iterator of iterators, each containing a segment of the original iterable.

        By default, the delimiting items are not included in the output.

        To include them, set *keep_separator* to `True`.
        At most *max_split* splits are done.

        If *max_split* is not specified or -1, then there is no limit on the number of splits.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def _to_res(x: pc.Iter[pc.Iter[str]]) -> pc.Seq[pc.Seq[str]]:
        ...     return x.map(lambda x: x.into(list)).collect()
        >>>
        >>> pc.Iter("abcdcba").split_at(lambda x: x == "b").into(_to_res)
        Seq(['a'], ['c', 'd', 'c'], ['a'])
        >>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1).into(_to_res)
        Seq([0], [2], [4], [6], [8], [])
        >>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1, max_split=2).into(_to_res)
        Seq([0], [2], [4, 5, 6, 7, 8, 9])
        >>>
        >>> def cond(x: str) -> bool:
        ...     return x == "b"
        >>>
        >>> pc.Iter("abcdcba").split_at(cond, keep_separator=True).into(_to_res)
        Seq(['a'], ['b'], ['c', 'd', 'c'], ['b'], ['a'])

        ```
        """

        def _split_at(data: Iterator[T], max_split: int) -> Iterator[Self]:
            """Credits: more_itertools.split_at."""
            new = self.__class__
            if max_split == 0:
                yield self
                return

            buf: list[T] = []
            for item in data:
                if predicate(item):
                    yield new(buf)
                    if keep_separator:
                        yield new((item,))
                    if max_split == 1:
                        yield new(data)
                        return
                    buf = []
                    max_split -= 1
                else:
                    buf.append(item)
            yield new(buf)

        return Iter(_split_at(self._inner, max_split))

    def split_after(
        self,
        predicate: Callable[[T], bool],
        max_split: int = -1,
    ) -> Iter[Self]:
        """Yield iterator of items from iterable, where each iterator ends with an item where `predicate` returns True.

        By default, no limit is placed on the number of splits.

        Args:
            predicate (Callable[[T], bool]): Function to determine the split points.
            max_split (int): Maximum number of splits to perform.

        Returns:
            Iter[Self]: An iterator of iterators of items.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter("one1two2").split_after(str.isdigit).map(list).collect()
        Seq(['o', 'n', 'e', '1'], ['t', 'w', 'o', '2'])

        >>> def cond(n: int) -> bool:
        ...     return n % 3 == 0
        >>>
        >>> pc.Iter(range(10)).split_after(cond).map(list).collect()
        Seq([0], [1, 2, 3], [4, 5, 6], [7, 8, 9])
        >>> pc.Iter(range(10)).split_after(cond, max_split=2).map(list).collect()
        Seq([0], [1, 2, 3], [4, 5, 6, 7, 8, 9])

        ```
        """

        def _split_after(data: Iterator[T], max_split: int) -> Iterator[Self]:
            """Credits: more_itertools.split_after."""
            new = self.__class__
            if max_split == 0:
                yield new(data)
                return

            buf: list[T] = []
            for item in data:
                buf.append(item)
                if predicate(item) and buf:
                    yield new(buf)
                    if max_split == 1:
                        buf = list(data)
                        if buf:
                            yield new(buf)
                        return
                    buf = []
                    max_split -= 1
            if buf:
                yield new(buf)

        return Iter(_split_after(self._inner, max_split))

    def split_before(
        self,
        predicate: Callable[[T], bool],
        max_split: int = -1,
    ) -> Iter[Self]:
        """Yield iterator of items from iterable, where each iterator ends with an item where `predicate` returns True.

        By default, no limit is placed on the number of splits.

        Args:
            predicate (Callable[[T], bool]): Function to determine the split points.
            max_split (int): Maximum number of splits to perform.

        Returns:
            Iter[Self]: An iterator of iterators of items.

        At most *max_split* splits are done.

        If *max_split* is not specified or -1, then there is no limit on the number of splits.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter("abcdcba").split_before(lambda x: x == "b").map(list).collect()
        Seq(['a'], ['b', 'c', 'd', 'c'], ['b', 'a'])
        >>>
        >>> def cond(n: int) -> bool:
        ...     return n % 2 == 1
        >>>
        >>> pc.Iter(range(10)).split_before(cond).map(list).collect()
        Seq([0], [1, 2], [3, 4], [5, 6], [7, 8], [9])
        >>> pc.Iter(range(10)).split_before(cond, max_split=2).map(list).collect()
        Seq([0], [1, 2], [3, 4, 5, 6, 7, 8, 9])

        ```
        """

        def _split_before(data: Iterator[T], max_split: int) -> Iterator[Self]:
            """Credits: more_itertools.split_before."""
            new = self.__class__

            if max_split == 0:
                yield new(data)
                return

            buf: list[T] = []
            for item in data:
                if predicate(item) and buf:
                    yield new(buf)
                    if max_split == 1:
                        yield new([item, *data])
                        return
                    buf = []
                    max_split -= 1
                buf.append(item)
            if buf:
                yield new(buf)

        return Iter(_split_before(self._inner, max_split))

    def find_map[R](self, func: Callable[[T], Option[R]]) -> Option[R]:
        """Applies function to the elements of the `Iterator` and returns the first Some(R) result.

        `Iter.find_map(f)` is equivalent to `Iter.filter_map(f).next()`.

        Args:
            func (Callable[[T], Option[R]]): Function to apply to each element, returning an `Option[R]`.

        Returns:
            Option[R]: The first `Some(R)` result from applying `func`, or `NONE` if no such result is found.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def _parse(s: str) -> pc.Option[int]:
        ...     try:
        ...         return pc.Some(int(s))
        ...     except ValueError:
        ...         return pc.NONE
        >>>
        >>> pc.Iter(["lol", "NaN", "2", "5"]).find_map(_parse)
        Some(2)

        ```
        """
        return self.filter_map(func).next()

    # map -----------------------------------------------------------------

    def map[R](self, func: Callable[[T], R]) -> Iter[R]:
        """Apply a function **func** to each element of the `Iter`.

        If you are good at thinking in types, you can think of `Iter.map()` like this:

        - You have an `Iterator` that gives you elements of some type `A`
        - You want an `Iterator` of some other type `B`
        - Then you can use `.map()`, passing a closure **func** that takes an `A` and returns a `B`.

        `Iter.map()` is conceptually similar to a for loop.

        However, as `Iter.map()` is lazy, it is best used when you are already working with other `Iter` instances.

        If you are doing some sort of looping for a side effect, it is considered more idiomatic to use `Iter.for_each()` than `Iter.map().collect()`.

        Args:
            func (Callable[[T], R]): Function to apply to each element.

        Returns:
            Iter[R]: An iterator of transformed elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2]).map(lambda x: x + 1).collect()
        Seq(2, 3)
        >>> # You can use methods on the class rather than on instance for convenience:
        >>> pc.Iter(["a", "b", "c"]).map(str.upper).collect()
        Seq('A', 'B', 'C')
        >>> pc.Iter(["a", "b", "c"]).map(lambda s: s.upper()).collect()
        Seq('A', 'B', 'C')

        ```
        """
        return Iter(map(func, self._inner))

    @overload
    def map_star[R](
        self: Iter[tuple[Any]],
        func: Callable[[Any], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, R](
        self: Iter[tuple[T1, T2]],
        func: Callable[[T1, T2], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, R](
        self: Iter[tuple[T1, T2, T3]],
        func: Callable[[T1, T2, T3], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, R](
        self: Iter[tuple[T1, T2, T3, T4]],
        func: Callable[[T1, T2, T3, T4], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, R](
        self: Iter[tuple[T1, T2, T3, T4, T5]],
        func: Callable[[T1, T2, T3, T4, T5], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, T6, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6]],
        func: Callable[[T1, T2, T3, T4, T5, T6], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, T6, T7, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, T6, T7, T8, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, T6, T7, T8, T9, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9], R],
    ) -> Iter[R]: ...
    @overload
    def map_star[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], R],
    ) -> Iter[R]: ...
    def map_star[U: Iterable[Any], R](
        self: Iter[U],
        func: Callable[..., R],
    ) -> Iter[R]:
        """Applies a function to each element.where each element is an iterable.

        Unlike `.map()`, which passes each element as a single argument, `.starmap()` unpacks each element into positional arguments for the function.

        In short, for each element in the `Iter`, it computes `func(*element)`.
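
        The unpacking itself is just `itertools.starmap`; a minimal stdlib illustration:
        ```python
        import itertools

        # each tuple is unpacked into positional arguments of the callable
        pairs = [(2, 3), (4, 5)]
        products = list(itertools.starmap(lambda a, b: a * b, pairs))
        # products == [6, 20]
        ```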

        Note:
            Always prefer using `.map_star()` over `.map()` when working with `Iter` of `tuple` elements.
            Not only is it more readable, but it is also more performant (up to 30% faster in benchmarks).

        Args:
            func (Callable[..., R]): Function to apply to unpacked elements.

        Returns:
            Iter[R]: An iterable of results from applying the function to unpacked elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def make_sku(color: str, size: str) -> str:
        ...     return f"{color}-{size}"
        >>> data = pc.Seq(["blue", "red"])
        >>> data.iter().product(["S", "M"]).map_star(make_sku).collect()
        Seq('blue-S', 'blue-M', 'red-S', 'red-M')
        >>> # This is equivalent to:
        >>> data.iter().product(["S", "M"]).map(lambda x: make_sku(*x)).collect()
        Seq('blue-S', 'blue-M', 'red-S', 'red-M')

        ```
        """
        return Iter(itertools.starmap(func, self._inner))

    def map_while[R](self, func: Callable[[T], Option[R]]) -> Iter[R]:
        """Creates an iterator that both yields elements based on a predicate and maps.

        `map_while()` takes a closure as an argument. It will call this closure on each element of
        the iterator, and yield elements while it returns `Some(_)`.

        After `NONE` is returned, `map_while()` stops and the rest of the elements are ignored.

        Args:
            func (Callable[[T], Option[R]]): Function to apply to each element that returns `Option[R]`.

        Returns:
            Iter[R]: An iterator of transformed elements until `NONE` is encountered.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def checked_div(x: int) -> pc.Option[int]:
        ...     return pc.Some(16 // x) if x != 0 else pc.NONE
        >>>
        >>> data = pc.Iter([-1, 4, 0, 1])
        >>> data.map_while(checked_div).collect()
        Seq(-16, 4)
        >>> data = pc.Iter([0, 1, 2, -3, 4, 5, -6])
        >>> # Convert to positive ints, stop at first negative
        >>> data.map_while(lambda x: pc.Some(x) if x >= 0 else pc.NONE).collect()
        Seq(0, 1, 2)

        ```
        """

        def _gen() -> Generator[R]:
            for opt in map(func, self._inner):
                if opt.is_none():
                    return
                yield opt.unwrap()

        return Iter(_gen())

    def repeat(self, n: int | None = None) -> Iter[Self]:
        """Repeat the entire `Iter` **n** times (as elements).

        If **n** is `None`, repeat indefinitely.

        Operates lazily, hence if you need to get the underlying elements, you will need to collect each repeated `Iter` via `.map(lambda x: x.collect())` or similar.

        Warning:
            If **n** is `None`, this will create an infinite `Iterator`.

            Be sure to use `Iter.take()` or `Iter.slice()` to limit the number of items taken.

        See Also:
            `Iter.cycle()` to repeat the *elements* of the `Iter` indefinitely.

        Args:
            n (int | None): Optional number of repetitions.

        Returns:
            Iter[Self]: An `Iter` of repeated `Iter`.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2]).repeat(3).map(list).collect()
        Seq([1, 2], [1, 2], [1, 2])

        ```
        """
        new = self.__class__

        def _repeat_infinite() -> Generator[Self]:
            source = self._inner
            while True:
                # tee the remaining source so each repetition reads from its
                # own buffered copy instead of the already-consumed iterator
                source, copy = itertools.tee(source)
                yield new(copy)

        if n is None:
            return Iter(_repeat_infinite())
        return Iter(map(new, itertools.tee(self._inner, n)))

    def scan[U](self, initial: U, func: Callable[[U, T], Option[U]]) -> Iter[U]:
        """Transform elements by sharing state between iterations.

        `scan` takes two arguments:
            - an **initial** value which seeds the internal state
            - a **func** with two arguments

        The first being a reference to the internal state and the second an iterator element.

        The **func** can assign to the internal state to share state between iterations.

        On iteration, **func** is applied to each element of the iterator; the `Option` it returns determines what `next` produces.

        Thus the **func** can return `Some(value)` to yield `value`, or `NONE` to end the iteration.

        Args:
            initial (U): Initial state.
            func (Callable[[U, T], Option[U]]): Function that takes the current state and an item, and returns an Option.

        Returns:
            Iter[U]: An iterable of the yielded values.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def accumulate_until_limit(state: int, item: int) -> pc.Option[int]:
        ...     new_state = state + item
        ...     match new_state:
        ...         case _ if new_state <= 10:
        ...             return pc.Some(new_state)
        ...         case _:
        ...             return pc.NONE
        >>> pc.Iter([1, 2, 3, 4, 5]).scan(0, accumulate_until_limit).collect()
        Seq(1, 3, 6, 10)

        ```
        """

        def _gen(data: Iterable[T]) -> Iterator[U]:
            current: U = initial
            for item in data:
                res = func(current, item)
                if res.is_none():
                    break
                current = res.unwrap()
                yield current

        return Iter(_gen(self._inner))

    # filters ------------------------------------------------------------
    @overload
    def filter[U](self, func: Callable[[T], TypeIs[U]]) -> Iter[U]: ...
    @overload
    def filter(self, func: Callable[[T], bool]) -> Iter[T]: ...
    def filter[U](self, func: Callable[[T], bool | TypeIs[U]]) -> Iter[T] | Iter[U]:
        """Creates an `Iter` which uses a closure to determine if an element should be yielded.

        Given an element, the closure must return `True` or `False`.

        The returned `Iter` will yield only the elements for which the closure returns `True`.

        The closure can return a `TypeIs` to narrow the type of the returned iterable.

        This won't have any runtime effect, but allows for better type inference.

        Note:
            `Iter.filter(f).next()` is equivalent to `Iter.find(f)`.

        Args:
            func (Callable[[T], bool | TypeIs[U]]): Function to evaluate each item.

        Returns:
            Iter[T] | Iter[U]: An iterable of the items that satisfy the predicate.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = (1, 2, 3)
        >>> pc.Iter(data).filter(lambda x: x > 1).collect()
        Seq(2, 3)
        >>> # See the equivalence of next and find:
        >>> pc.Iter(data).filter(lambda x: x > 1).next()
        Some(2)
        >>> pc.Iter(data).find(lambda x: x > 1)
        Some(2)
        >>> # Using TypeIs to narrow type:
        >>> from typing import TypeIs
        >>> def _is_str(x: object) -> TypeIs[str]:
        ...     return isinstance(x, str)
        >>> mixed_data = [1, "two", 3.0, "four"]
        >>> pc.Iter(mixed_data).filter(_is_str).collect()
        Seq('two', 'four')

        ```
        """
        return Iter(filter(func, self._inner))

    @overload
    def filter_star(
        self: Iter[tuple[Any]],
        func: Callable[[Any], bool],
    ) -> Iter[tuple[Any]]: ...
    @overload
    def filter_star[T1, T2](
        self: Iter[tuple[T1, T2]],
        func: Callable[[T1, T2], bool],
    ) -> Iter[tuple[T1, T2]]: ...
    @overload
    def filter_star[T1, T2, T3](
        self: Iter[tuple[T1, T2, T3]],
        func: Callable[[T1, T2, T3], bool],
    ) -> Iter[tuple[T1, T2, T3]]: ...
    @overload
    def filter_star[T1, T2, T3, T4](
        self: Iter[tuple[T1, T2, T3, T4]],
        func: Callable[[T1, T2, T3, T4], bool],
    ) -> Iter[tuple[T1, T2, T3, T4]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5](
        self: Iter[tuple[T1, T2, T3, T4, T5]],
        func: Callable[[T1, T2, T3, T4, T5], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5, T6](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6]],
        func: Callable[[T1, T2, T3, T4, T5, T6], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5, T6]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5, T6, T7](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5, T6, T7, T8](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5, T6, T7, T8, T9](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9]]: ...
    @overload
    def filter_star[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], bool],
    ) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10]]: ...

    def filter_star[U: Iterable[Any]](
        self: Iter[U],
        func: Callable[..., bool],
    ) -> Iter[U]:
        """Creates an `Iter` which uses a closure **func** to determine if an element should be yielded, where each element is an iterable.

        Unlike `.filter()`, which passes each element as a single argument, `.filter_star()` unpacks each element into positional arguments for the **func**.

        In short, for each element in the `Iter`, it computes `func(*element)`.

        This is useful after using methods like `.zip()`, `.product()`, or `.enumerate()` that yield tuples.

        Args:
            func (Callable[..., bool]): Function to evaluate unpacked elements.

        Returns:
            Iter[U]: An `Iter` of the items that satisfy the predicate.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = pc.Seq(["apple", "banana", "cherry", "date"])
        >>> data.iter().enumerate().filter_star(lambda index, fruit: index % 2 == 0).map_star(lambda index, fruit: fruit.title()).collect()
        Seq('Apple', 'Cherry')

        ```
        """
        return Iter(filter(lambda x: func(*x), self._inner))

    @overload
    def filter_false[U](self, func: Callable[[T], TypeIs[U]]) -> Iter[U]: ...
    @overload
    def filter_false(self, func: Callable[[T], bool]) -> Iter[T]: ...
    def filter_false[U](
        self, func: Callable[[T], bool | TypeIs[U]]
    ) -> Iter[T] | Iter[U]:
        """Return elements for which **func** is `False`.

        The **func** can return a `TypeIs` to narrow the type of the returned `Iter`.

        This won't have any runtime effect, but allows for better type inference.

        Args:
            func (Callable[[T], bool | TypeIs[U]]): Function to evaluate each item.

        Returns:
            Iter[T] | Iter[U]: An `Iter` of the items that do not satisfy the predicate.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).filter_false(lambda x: x > 1).collect()
        Seq(1,)

        ```
        """
        return Iter(itertools.filterfalse(func, self._inner))

    def filter_map[R](self, func: Callable[[T], Option[R]]) -> Iter[R]:
        """Creates an iterator that both filters and maps.

        The returned iterator yields only the values for which the supplied closure returns `Some(value)`.

        `.filter_map()` can be used to make chains of `.filter()` and `.map()` more concise.

        The example below shows how a `.map().filter().map()` chain can be shortened to a single call to `.filter_map()`.

        Args:
            func (Callable[[T], Option[R]]): Function to apply to each item.

        Returns:
            Iter[R]: An iterable of the results where func returned `Some`.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def _parse(s: str) -> pc.Result[int, str]:
        ...     try:
        ...         return pc.Ok(int(s))
        ...     except ValueError:
        ...         return pc.Err(f"Invalid integer, got {s!r}")
        >>>
        >>> data = pc.Seq(["1", "two", "NaN", "four", "5"])
        >>> data.iter().filter_map(lambda s: _parse(s).ok()).collect()
        Seq(1, 5)
        >>> # Equivalent to:
        >>> (
        ...     data.iter()
        ...    .map(lambda s: _parse(s).ok())
        ...    .filter(lambda s: s.is_some())
        ...    .map(lambda s: s.unwrap())
        ...    .collect()
        ... )
        Seq(1, 5)

        ```
        """

        def _filter_map(data: Iterable[T]) -> Iterator[R]:
            for item in data:
                res = func(item)
                if res.is_some():
                    yield res.unwrap()

        return Iter(_filter_map(self._inner))

    @overload
    def filter_map_star[R](
        self: Iter[tuple[Any]],
        func: Callable[[Any], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, R](
        self: Iter[tuple[T1, T2]],
        func: Callable[[T1, T2], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, R](
        self: Iter[tuple[T1, T2, T3]],
        func: Callable[[T1, T2, T3], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, R](
        self: Iter[tuple[T1, T2, T3, T4]],
        func: Callable[[T1, T2, T3, T4], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, R](
        self: Iter[tuple[T1, T2, T3, T4, T5]],
        func: Callable[[T1, T2, T3, T4, T5], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, T6, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6]],
        func: Callable[[T1, T2, T3, T4, T5, T6], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, T6, T7, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, T6, T7, T8, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, T6, T7, T8, T9, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9], Option[R]],
    ) -> Iter[R]: ...
    @overload
    def filter_map_star[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10, R](
        self: Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10]],
        func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], Option[R]],
    ) -> Iter[R]: ...
    def filter_map_star[U: Iterable[Any], R](
        self: Iter[U],
        func: Callable[..., Option[R]],
    ) -> Iter[R]:
        """Creates an iterator that both filters and maps, where each element is an iterable.

        Unlike `.filter_map()`, which passes each element as a single argument, `.filter_map_star()` unpacks each element into positional arguments for the function.

        In short, for each `element` in the sequence, it computes `func(*element)`.

        This is useful after using methods like `.zip()`, `.product()`, or `.enumerate()` that yield tuples.

        Args:
            func (Callable[..., Option[R]]): Function to apply to unpacked elements.

        Returns:
            Iter[R]: An iterable of the results where func returned `Some`.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = pc.Seq([("1", "10"), ("two", "20"), ("3", "thirty")])
        >>> def _parse_pair(s1: str, s2: str) -> pc.Result[tuple[int, int], str]:
        ...     try:
        ...         return pc.Ok((int(s1), int(s2)))
        ...     except ValueError:
        ...         return pc.Err(f"Invalid integer pair: {s1!r}, {s2!r}")
        >>>
        >>> data.iter().filter_map_star(lambda s1, s2: _parse_pair(s1, s2).ok()).collect()
        Seq((1, 10),)

        ```
        """

        def _filter_map_star(data: Iterable[U]) -> Iterator[R]:
            for item in data:
                res = func(*item)
                if res.is_some():
                    yield res.unwrap()

        return Iter(_filter_map_star(self._inner))

    # joins and zips ------------------------------------------------------------
    @overload
    def zip[T1](
        self,
        iter1: Iterable[T1],
        /,
        *,
        strict: bool = ...,
    ) -> Iter[tuple[T, T1]]: ...
    @overload
    def zip[T1, T2](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        /,
        *,
        strict: bool = ...,
    ) -> Iter[tuple[T, T1, T2]]: ...
    @overload
    def zip[T1, T2, T3](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        /,
        *,
        strict: bool = ...,
    ) -> Iter[tuple[T, T1, T2, T3]]: ...
    @overload
    def zip[T1, T2, T3, T4](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        iter4: Iterable[T4],
        /,
        *,
        strict: bool = ...,
    ) -> Iter[tuple[T, T1, T2, T3, T4]]: ...
    def zip(
        self,
        *others: Iterable[Any],
        strict: bool = False,
    ) -> Iter[tuple[Any, ...]]:
        """Yields n-length tuples, where n is the number of iterables passed as positional arguments.

        The i-th element in every tuple comes from the i-th iterable argument to `.zip()`.

        This continues until the shortest argument is exhausted.

        Note:
            `Iter.map_star` can then be used for subsequent operations on the zipped values, in a destructuring manner.
            This keeps the code clean and readable, avoiding index access like `[0]` and `[1]` in inline lambdas.

        Args:
            *others (Iterable[Any]): Other iterables to zip with.
            strict (bool): If `True` and one of the iterables is exhausted before the others, a `ValueError` is raised.

        Returns:
            Iter[tuple[Any, ...]]: An `Iter` of tuples containing elements from the zipped Iter and other iterables.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2]).zip([10, 20]).collect()
        Seq((1, 10), (2, 20))
        >>> pc.Iter(["a", "b"]).zip([1, 2, 3]).collect()
        Seq(('a', 1), ('b', 2))
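        >>> # With strict=True, a length mismatch raises an error once the iterator
        >>> # is consumed (the message shown is CPython's; treat it as illustrative):
        >>> pc.Iter(["a", "b"]).zip([1, 2, 3], strict=True).collect()
        Traceback (most recent call last):
            ...
        ValueError: zip() argument 2 is longer than argument 1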

        ```
        """
        return Iter(zip(self._inner, *others, strict=strict))

    @overload
    def zip_longest[T2](
        self, iter2: Iterable[T2], /
    ) -> Iter[tuple[Option[T], Option[T2]]]: ...
    @overload
    def zip_longest[T2, T3](
        self, iter2: Iterable[T2], iter3: Iterable[T3], /
    ) -> Iter[tuple[Option[T], Option[T2], Option[T3]]]: ...
    @overload
    def zip_longest[T2, T3, T4](
        self,
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        iter4: Iterable[T4],
        /,
    ) -> Iter[tuple[Option[T], Option[T2], Option[T3], Option[T4]]]: ...
    @overload
    def zip_longest[T2, T3, T4, T5](
        self,
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        iter4: Iterable[T4],
        iter5: Iterable[T5],
        /,
    ) -> Iter[
        tuple[
            Option[T],
            Option[T2],
            Option[T3],
            Option[T4],
            Option[T5],
        ]
    ]: ...
    @overload
    def zip_longest(
        self,
        iter2: Iterable[T],
        iter3: Iterable[T],
        iter4: Iterable[T],
        iter5: Iterable[T],
        iter6: Iterable[T],
        /,
        *iterables: Iterable[T],
    ) -> Iter[tuple[Option[T], ...]]: ...
    def zip_longest(self, *others: Iterable[Any]) -> Iter[tuple[Option[Any], ...]]:
        """Return a zip Iterator who yield a tuple where the i-th element comes from the i-th iterable argument.

        Yield values until the longest iterable in the argument sequence is exhausted, and then it raises StopIteration.

        The longest iterable determines the length of the returned iterator, and will return `Some[T]` until exhaustion.

        When the shorter iterables are exhausted, they yield `NONE`.

        Args:
            *others (Iterable[Any]): Other iterables to zip with.

        Returns:
            Iter[tuple[Option[Any], ...]]: An iterable of tuples containing optional elements from the zipped iterables.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2]).zip_longest([10]).collect()
        Seq((Some(1), Some(10)), (Some(2), NONE))
        >>> # Can be combined with try collect to filter out the NONE:
        >>> pc.Iter([1, 2]).zip_longest([10]).map(lambda x: pc.Iter(x).try_collect()).collect()
        Seq(Some(Vec(1, 10)), NONE)

        ```
        """
        return Iter(
            tuple(Option(t) for t in tup)
            for tup in itertools.zip_longest(self._inner, *others, fillvalue=None)
        )

    def unzip[U, V](self: Iter[tuple[U, V]]) -> Unzipped[U, V]:
        """Converts an iterator of pairs into a pair of iterators.

        Returns:
            Unzipped[U, V]: dataclass with first and second iterators.


        Returns an `Unzipped` dataclass, containing two iterators:

        - one from the left elements of the pairs
        - one from the right elements.

        This function is, in some sense, the opposite of `.zip()`.

        Note:
            Both iterators share the same underlying source.

            Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

            This is the unavoidable cost of having two independent iterators over the same source.

        ```python
        >>> import pyochain as pc
        >>> data = [(1, "a"), (2, "b"), (3, "c")]
        >>> unzipped = pc.Iter(data).unzip()
        >>> unzipped.left.collect()
        Seq(1, 2, 3)
        >>> unzipped.right.collect()
        Seq('a', 'b', 'c')

        ```
        """
        left, right = itertools.tee(self._inner, 2)
        return Unzipped(Iter(x[0] for x in left), Iter(x[1] for x in right))

    def cloned(self) -> Self:
        """Clone the `Iter` into a new independent `Iter` using `itertools.tee`.

        After calling this method, the original `Iter` will continue to yield elements independently of the cloned one.

        Note:
            Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

            This is the unavoidable cost of having two independent iterators over the same source.

            However, once both iterators have passed a value, it's freed from memory.

        Returns:
            Self: A new independent cloned iterator.

        Example:
        ```python
        >>> import pyochain as pc
        >>> it = pc.Iter([1, 2, 3])
        >>> cloned = it.cloned()
        >>> cloned.collect()
        Seq(1, 2, 3)
        >>> it.collect()
        Seq(1, 2, 3)

        ```
        """
        it1, it2 = itertools.tee(self._inner)
        self._inner = it1
        return self.__class__(it2)

    @overload
    def product(self) -> Iter[tuple[T]]: ...
    @overload
    def product[T1](self, iter1: Iterable[T1], /) -> Iter[tuple[T, T1]]: ...
    @overload
    def product[T1, T2](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        /,
    ) -> Iter[tuple[T, T1, T2]]: ...
    @overload
    def product[T1, T2, T3](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        /,
    ) -> Iter[tuple[T, T1, T2, T3]]: ...
    @overload
    def product[T1, T2, T3, T4](
        self,
        iter1: Iterable[T1],
        iter2: Iterable[T2],
        iter3: Iterable[T3],
        iter4: Iterable[T4],
        /,
    ) -> Iter[tuple[T, T1, T2, T3, T4]]: ...

    def product(self, *others: Iterable[Any]) -> Iter[tuple[Any, ...]]:
        """Computes the Cartesian product with another iterable.

        This is the declarative equivalent of nested for-loops.

        It pairs every element from the source iterable with every element from the
        other iterable.

        Args:
            *others (Iterable[Any]): Other iterables to compute the Cartesian product with.

        Returns:
            Iter[tuple[Any, ...]]: An iterable of tuples containing elements from the Cartesian product.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter(["blue", "red"]).product(["S", "M"]).collect()
        Seq(('blue', 'S'), ('blue', 'M'), ('red', 'S'), ('red', 'M'))
        >>> res = (
        ...     pc.Iter(["blue", "red"])
        ...     .product(["S", "M"])
        ...     .map_star(lambda color, size: f"{color}-{size}")
        ...     .collect()
        ... )
        >>> res
        Seq('blue-S', 'blue-M', 'red-S', 'red-M')
        >>> res = (
        ...     pc.Iter([1, 2, 3])
        ...     .product([10, 20])
        ...     .filter_star(lambda a, b: a * b >= 40)
        ...     .map_star(lambda a, b: a * b)
        ...     .collect()
        ... )
        >>> res
        Seq(40, 60)
        >>> res = (
        ...     pc.Iter([1])
        ...     .product(["a", "b"], [True])
        ...     .filter_star(lambda _a, b, _c: b != "a")
        ...     .map_star(lambda a, b, c: f"{a}{b} is {c}")
        ...     .collect()
        ... )
        >>> res
        Seq('1b is True',)

        ```
        """
        return Iter(itertools.product(self._inner, *others))

    def diff_at[R](
        self, other: Iterable[T], key: Callable[[T], R] | None = None
    ) -> Iter[tuple[Option[T], Option[T]]]:
        """Yields pairs of differing elements from two iterables.

        Compares elements from the source iterable and another iterable at corresponding positions.

        If elements differ (based on equality or a provided key function), yields a tuple containing the differing elements wrapped in `Option`.

        If one iterable is shorter, yields `NONE` for missing elements.

        Args:
            other (Iterable[T]): Other `Iterable` to compare with.
            key (Callable[[T], R] | None): Function to apply to each item for comparison.

        Returns:
            Iter[tuple[Option[T], Option[T]]]: An `Iter` of item pairs containing differing elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = pc.Seq([1, 2, 3])
        >>> data.iter().diff_at([1, 2, 10, 100]).collect()
        Seq((Some(3), Some(10)), (NONE, Some(100)))
        >>> data.iter().diff_at([1, 2, 10, 100, 2, 6, 7]).collect() # doctest: +NORMALIZE_WHITESPACE
        Seq((Some(3), Some(10)),
        (NONE, Some(100)),
        (NONE, Some(2)),
        (NONE, Some(6)),
        (NONE, Some(7)))
        >>> pc.Iter(["apples", "bananas"]).diff_at(["Apples", "Oranges"], key=str.lower).collect(list)
        [(Some('bananas'), Some('Oranges'))]

        ```
        """
        if key is None:

            def _gen_no_key() -> Iterator[tuple[Option[T], Option[T]]]:
                for first, second in itertools.zip_longest(
                    map(Some, self), map(Some, other), fillvalue=NONE
                ):
                    if first.ne(second):
                        yield first, second

            return Iter(_gen_no_key())

        def _gen_with_key() -> Iterator[tuple[Option[T], Option[T]]]:
            for first, second in itertools.zip_longest(
                map(Some, self), map(Some, other), fillvalue=NONE
            ):
                if first.map(key).ne(second.map(key)):
                    yield first, second

        return Iter(_gen_with_key())

    @overload
    def map_windows[R](
        self, length: Literal[1], func: Callable[[tuple[T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[2], func: Callable[[tuple[T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[3], func: Callable[[tuple[T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[4], func: Callable[[tuple[T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[5], func: Callable[[tuple[T, T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[6], func: Callable[[tuple[T, T, T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[7], func: Callable[[tuple[T, T, T, T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[8], func: Callable[[tuple[T, T, T, T, T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: Literal[9], func: Callable[[tuple[T, T, T, T, T, T, T, T, T]], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self,
        length: Literal[10],
        func: Callable[[tuple[T, T, T, T, T, T, T, T, T, T]], R],
    ) -> Iter[R]: ...
    @overload
    def map_windows[R](
        self, length: int, func: Callable[[tuple[T, ...]], R]
    ) -> Iter[R]: ...
    def map_windows[R](
        self, length: int, func: Callable[[tuple[Any, ...]], R]
    ) -> Iter[R]:
        r"""Calls the given *func* for each contiguous window of size *length* over **self**.

        Successive windows overlap.

        The provided function is called with the entire window as a single tuple argument.

        Args:
            length (int): The length of each window.
            func (Callable[[tuple[Any, ...]], R]): Function to apply to each window.

        Returns:
            Iter[R]: An iterator over the outputs of func.

        See Also:
            `.map_windows_star()` for a version that unpacks the window into separate arguments.

        Example:
        ```python
        >>> import pyochain as pc
        >>> import statistics
        >>> pc.Iter([1, 2, 3, 4]).map_windows(2, statistics.mean).collect()
        Seq(1.5, 2.5, 3.5)
        >>> pc.Iter("abcd").map_windows(3, lambda window: "".join(window).upper()).collect()
        Seq('ABC', 'BCD')
        >>> pc.Iter([10, 20, 30, 40, 50]).map_windows(4, sum).collect()
        Seq(100, 140)
        >>> from pathlib import Path
        >>> pc.Iter(["home", "src", "pyochain"]).map_windows(2, lambda p: str(Path(*p))).collect()
        Seq('home\\src', 'src\\pyochain')


        ```
        """
        return Iter(map(func, cz.itertoolz.sliding_window(length, self._inner)))

    @overload
    def map_windows_star[R](
        self, length: Literal[1], func: Callable[[T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[2], func: Callable[[T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[3], func: Callable[[T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[4], func: Callable[[T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[5], func: Callable[[T, T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[6], func: Callable[[T, T, T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[7], func: Callable[[T, T, T, T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[8], func: Callable[[T, T, T, T, T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[9], func: Callable[[T, T, T, T, T, T, T, T, T], R]
    ) -> Iter[R]: ...
    @overload
    def map_windows_star[R](
        self, length: Literal[10], func: Callable[[T, T, T, T, T, T, T, T, T, T], R]
    ) -> Iter[R]: ...
    def map_windows_star[R](self, length: int, func: Callable[..., R]) -> Iter[R]:
        """Calls the given *func* for each contiguous window of size *length* over **self**.

        Successive windows overlap.

        The provided function is called with each element of the window as separate arguments.

        Args:
            length (int): The length of each window.
            func (Callable[..., R]): Function to apply to each window.

        Returns:
            Iter[R]: An iterator over the outputs of func.

        See Also:
            `.map_windows()` for a version that passes the entire window as a single tuple argument.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter("abcd").map_windows_star(2, lambda x, y: f"{x}+{y}").collect()
        Seq('a+b', 'b+c', 'c+d')
        >>> pc.Iter([1, 2, 3, 4]).map_windows_star(2, lambda x, y: x + y).collect()
        Seq(3, 5, 7)

        ```
        """
        return Iter(
            itertools.starmap(func, cz.itertoolz.sliding_window(length, self._inner))
        )

    @overload
    def partition(self, n: Literal[1], pad: None = None) -> Iter[tuple[T]]: ...
    @overload
    def partition(self, n: Literal[2], pad: None = None) -> Iter[tuple[T, T]]: ...
    @overload
    def partition(self, n: Literal[3], pad: None = None) -> Iter[tuple[T, T, T]]: ...
    @overload
    def partition(self, n: Literal[4], pad: None = None) -> Iter[tuple[T, T, T, T]]: ...
    @overload
    def partition(
        self,
        n: Literal[5],
        pad: None = None,
    ) -> Iter[tuple[T, T, T, T, T]]: ...
    @overload
    def partition(self, n: int, pad: T) -> Iter[tuple[T, ...]]: ...
    def partition(self, n: int, pad: T | None = None) -> Iter[tuple[T, ...]]:
        """Partition **self** into `tuples` of length **n**.

        Args:
            n (int): Length of each partition.
            pad (T | None): Value to pad the last partition if needed.

        Returns:
            Iter[tuple[T, ...]]: An iterable of partitioned tuples.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3, 4]).partition(2).collect()
        Seq((1, 2), (3, 4))

        ```
        If the length of the sequence is not evenly divisible by n, the final tuple is padded to length n with **pad** (`None` by default):
        ```python
        >>> pc.Iter([1, 2, 3, 4, 5]).partition(2).collect()
        Seq((1, 2), (3, 4), (5, None))
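        >>> # An explicit pad value fills the final tuple instead (pad=0 here is an
        >>> # arbitrary choice for illustration):
        >>> pc.Iter([1, 2, 3, 4, 5]).partition(2, pad=0).collect()
        Seq((1, 2), (3, 4), (5, 0))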

        ```
        """
        return Iter(cz.itertoolz.partition(n, self._inner, pad=pad))

    def partition_all(self, n: int) -> Iter[tuple[T, ...]]:
        """Partition all elements of sequence into tuples of length at most n.

        The final tuple may be shorter to accommodate extra elements.

        Args:
            n (int): Maximum length of each partition.

        Returns:
            Iter[tuple[T, ...]]: An iterable of partitioned tuples.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3, 4]).partition_all(2).collect()
        Seq((1, 2), (3, 4))
        >>> pc.Iter([1, 2, 3, 4, 5]).partition_all(2).collect()
        Seq((1, 2), (3, 4), (5,))

        ```
        """
        return Iter(cz.itertoolz.partition_all(n, self._inner))

    def partition_by(self, predicate: Callable[[T], bool]) -> Iter[tuple[T, ...]]:
        """Partition the `Iterator` into a sequence of `tuples` according to a predicate function.

        Every time the output of `predicate` changes, a new `tuple` is started,
        and subsequent items are collected into that `tuple`.

        Args:
            predicate (Callable[[T], bool]): Function to determine partition boundaries.

        Returns:
            Iter[tuple[T, ...]]: An iterable of partitioned tuples.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter("I have space").partition_by(lambda c: c == " ").collect()
        Seq(('I',), (' ',), ('h', 'a', 'v', 'e'), (' ',), ('s', 'p', 'a', 'c', 'e'))
        >>>
        >>> data = [1, 2, 1, 99, 88, 33, 99, -1, 5]
        >>> pc.Iter(data).partition_by(lambda x: x > 10).collect()
        Seq((1, 2, 1), (99, 88, 33, 99), (-1, 5))

        ```
        """
        return Iter(cz.recipes.partitionby(predicate, self._inner))

    def batch(self, n: int, *, strict: bool = False) -> Iter[tuple[T, ...]]:
        """Batch elements into tuples of length n and return a new Iter.

        - The last batch may be shorter than n.
        - The data is consumed lazily, just enough to fill a batch.
        - The result is yielded as soon as a batch is full or when the input iterable is exhausted.

        Args:
            n (int): Number of elements in each batch.
            strict (bool): If `True`, raises a ValueError if the last batch is not of length n.

        Returns:
            Iter[tuple[T, ...]]: An iterable of batched tuples.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter("ABCDEFG").batch(3).collect()
        Seq(('A', 'B', 'C'), ('D', 'E', 'F'), ('G',))
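        >>> # With strict=True, an incomplete final batch raises an error (the message
        >>> # shown is from CPython's itertools.batched; treat it as illustrative):
        >>> pc.Iter("ABCDEFG").batch(3, strict=True).collect()
        Traceback (most recent call last):
            ...
        ValueError: batched(): incomplete batch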

        ```
        """
        return Iter(itertools.batched(self._inner, n, strict=strict))

    def peekable(self, n: int) -> Peekable[T]:
        """Retrieve the next **n** elements from the `Iterator`, and return a `Seq` of the retrieved elements along with the original `Iterator`, unconsumed.

        The returned `Peekable` object contains two attributes:
        - *peek*: A `Seq` of the next **n** elements.
        - *values*: An `Iter` that includes the peeked elements followed by the remaining elements of the original `Iterator`.

        `Peekable` implement `Checkable` on the *peek* attribute.

        Args:
            n (int): Number of items to peek.

        Returns:
            Peekable[T]: A `Peekable` object containing the peeked elements and the remaining iterator.

        See Also:
            `Iter.cloned()` to create an independent copy of the iterator.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = pc.Iter([1, 2, 3]).peekable(2)
        >>> data.peek
        Seq(1, 2)
        >>> data.values.collect()
        Seq(1, 2, 3)

        ```
        """
        peeked = Seq(itertools.islice(self._inner, n))
        return Peekable(peeked, Iter(itertools.chain(peeked, self._inner)))

    def is_strictly_n(self, n: int) -> Iter[Result[T, ValueError]]:
        """Yield`Ok[T]` as long as the iterable has exactly *n* items.

        If it has fewer than *n* items, yield `Err[ValueError]` with the actual number of items.

        If it has more than *n* items, yield `Err[ValueError]` with the number `n + 1`.

        Note that the returned iterable must be consumed in order for the check to
        be made.

        Args:
            n (int): The exact number of items expected.

        Returns:
            Iter[Result[T, ValueError]]: A new Iterable wrapper yielding results based on the item count.

        Example:
        ```python
        >>> import pyochain as pc
        >>> data = ["a", "b", "c", "d"]
        >>> n = 4
        >>> pc.Iter(data).is_strictly_n(n).collect()
        Seq(Ok('a'), Ok('b'), Ok('c'), Ok('d'))
        >>> pc.Iter("ab").is_strictly_n(3).collect()  # doctest: +NORMALIZE_WHITESPACE
        Seq(Ok('a'), Ok('b'),
        Err(ValueError('Too few items in iterable (got 2)')))
        >>> pc.Iter("abc").is_strictly_n(2).collect()  # doctest: +NORMALIZE_WHITESPACE
        Seq(Ok('a'), Ok('b'),
        Err(ValueError('Too many items in iterable (got at least 3)')))

        ```
        You can easily combine this with `.map(lambda r: r.map_err(...))` to handle the errors as you wish.
        ```python
        >>> def _my_err(e: ValueError) -> str:
        ...     return f"custom error: {e}"
        >>>
        >>> pc.Iter([1]).is_strictly_n(0).map(lambda r: r.map_err(_my_err)).collect()
        Seq(Err('custom error: Too many items in iterable (got at least 1)'),)

        ```
        Or use `.filter_map(...)` to only keep the `Ok` values.
        ```python
        >>> pc.Iter([1, 2, 3]).is_strictly_n(2).filter_map(lambda r: r.ok()).collect()
        Seq(1, 2)

        ```
        """

        def _strictly_n_(data: Iterator[T]) -> Iterator[Result[T, ValueError]]:
            sent = 0
            for item in itertools.islice(data, n):
                yield Ok(item)
                sent += 1

            if sent < n:
                e = ValueError(f"Too few items in iterable (got {sent})")
                yield Err(e)

            for _ in data:
                e = ValueError(f"Too many items in iterable (got at least {n + 1})")
                yield Err(e)

        return Iter(_strictly_n_(self._inner))

    def enumerate(self, start: int = 0) -> Iter[tuple[int, T]]:
        """Return a `Iter` of (index, value) pairs.

        Each value in the `Iter` is paired with its index, starting from 0.

        Tip:
            `Iter.map_star` can then be used for subsequent operations on the index and value, in a destructuring manner.
            This keep the code clean and readable, without index access like `[0]` and `[1]` for inline lambdas.

        Args:
            start (int): The starting index.

        Returns:
            Iter[tuple[int, T]]: An `Iter` of (index, value) pairs.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter(["a", "b"]).enumerate().collect()
        Seq((0, 'a'), (1, 'b'))
        >>> pc.Iter(["a", "b"]).enumerate().map_star(lambda idx, val: (idx, val.upper())).collect()
        Seq((0, 'A'), (1, 'B'))
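        >>> # A custom starting index:
        >>> pc.Iter(["a", "b"]).enumerate(start=1).collect()
        Seq((1, 'a'), (2, 'b'))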

        ```
        """
        return Iter(enumerate(self._inner, start))

    @overload
    def combinations(self, r: Literal[2]) -> Iter[tuple[T, T]]: ...
    @overload
    def combinations(self, r: Literal[3]) -> Iter[tuple[T, T, T]]: ...
    @overload
    def combinations(self, r: Literal[4]) -> Iter[tuple[T, T, T, T]]: ...
    @overload
    def combinations(self, r: Literal[5]) -> Iter[tuple[T, T, T, T, T]]: ...
    def combinations(self, r: int) -> Iter[tuple[T, ...]]:
        """Return all combinations of length r.

        Args:
            r (int): Length of each combination.

        Returns:
            Iter[tuple[T, ...]]: An iterable of combinations.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).combinations(2).collect()
        Seq((1, 2), (1, 3), (2, 3))

        ```
        """
        return Iter(itertools.combinations(self._inner, r))

    @overload
    def permutations(self, r: Literal[2]) -> Iter[tuple[T, T]]: ...
    @overload
    def permutations(self, r: Literal[3]) -> Iter[tuple[T, T, T]]: ...
    @overload
    def permutations(self, r: Literal[4]) -> Iter[tuple[T, T, T, T]]: ...
    @overload
    def permutations(self, r: Literal[5]) -> Iter[tuple[T, T, T, T, T]]: ...
    def permutations(self, r: int | None = None) -> Iter[tuple[T, ...]]:
        """Return all permutations of length r.

        Args:
            r (int | None): Length of each permutation. Defaults to the length of the iterable.

        Returns:
            Iter[tuple[T, ...]]: An iterable of permutations.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).permutations(2).collect()
        Seq((1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2))

        ```
        """
        return Iter(itertools.permutations(self._inner, r))

    @overload
    def combinations_with_replacement(self, r: Literal[2]) -> Iter[tuple[T, T]]: ...
    @overload
    def combinations_with_replacement(self, r: Literal[3]) -> Iter[tuple[T, T, T]]: ...
    @overload
    def combinations_with_replacement(
        self,
        r: Literal[4],
    ) -> Iter[tuple[T, T, T, T]]: ...
    @overload
    def combinations_with_replacement(
        self,
        r: Literal[5],
    ) -> Iter[tuple[T, T, T, T, T]]: ...
    def combinations_with_replacement(self, r: int) -> Iter[tuple[T, ...]]:
        """Return all combinations with replacement of length r.

        Args:
            r (int): Length of each combination.

        Returns:
            Iter[tuple[T, ...]]: An iterable of combinations with replacement.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).combinations_with_replacement(2).collect()
        Seq((1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3))

        ```
        """
        return Iter(itertools.combinations_with_replacement(self._inner, r))

    def pairwise(self) -> Iter[tuple[T, T]]:
        """Return an iterator over pairs of consecutive elements.

        Returns:
            Iter[tuple[T, T]]: An iterable of pairs of consecutive elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).pairwise().collect()
        Seq((1, 2), (2, 3))

        ```
        """
        return Iter(itertools.pairwise(self._inner))

    @overload
    def map_juxt[R1, R2](
        self,
        func1: Callable[[T], R1],
        func2: Callable[[T], R2],
        /,
    ) -> Iter[tuple[R1, R2]]: ...
    @overload
    def map_juxt[R1, R2, R3](
        self,
        func1: Callable[[T], R1],
        func2: Callable[[T], R2],
        func3: Callable[[T], R3],
        /,
    ) -> Iter[tuple[R1, R2, R3]]: ...
    @overload
    def map_juxt[R1, R2, R3, R4](
        self,
        func1: Callable[[T], R1],
        func2: Callable[[T], R2],
        func3: Callable[[T], R3],
        func4: Callable[[T], R4],
        /,
    ) -> Iter[tuple[R1, R2, R3, R4]]: ...
    def map_juxt(self, *funcs: Callable[[T], object]) -> Iter[tuple[object, ...]]:
        """Apply several functions to each item.

        Returns a new Iter where each item is a tuple of the results of applying each function to the original item.

        Args:
            *funcs (Callable[[T], object]): Functions to apply to each item.

        Returns:
            Iter[tuple[object, ...]]: An iterable of tuples containing the results of each function.

        Example:
        ```python
        >>> import pyochain as pc
        >>> def is_even(n: int) -> bool:
        ...     return n % 2 == 0
        >>> def is_positive(n: int) -> bool:
        ...     return n > 0
        >>>
        >>> pc.Iter([1, -2, 3]).map_juxt(is_even, is_positive).collect()
        Seq((False, True), (True, False), (False, True))

        ```
        """
        return Iter(map(cz.functoolz.juxt(*funcs), self._inner))

    def with_position(self) -> Iter[tuple[Position, T]]:
        """Return an iterable over (`Position`, `T`) tuples.

        The `Position` indicates whether the item `T` is the first, middle, last, or only element in the iterable.

        Returns:
            Iter[tuple[Position, T]]: An iterable of (`Position`, item) tuples.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter(["a", "b", "c"]).with_position().collect()
        Seq(('first', 'a'), ('middle', 'b'), ('last', 'c'))
        >>> pc.Iter(["a"]).with_position().collect()
        Seq(('only', 'a'),)

        ```
        """

        def _gen(data: Iterator[T]) -> Iterator[tuple[Position, T]]:
            try:
                first = next(data)
            except StopIteration:
                return

            try:
                second = next(data)
            except StopIteration:
                yield ("only", first)
                return
            yield ("first", first)

            current: T = second
            for nxt in data:
                yield ("middle", current)
                current = nxt
            yield ("last", current)

        return Iter(_gen(self._inner))

    @overload
    def group_by(self, key: None = None) -> Iter[tuple[T, Self]]: ...
    @overload
    def group_by[K](self, key: Callable[[T], K]) -> Iter[tuple[K, Self]]: ...
    @overload
    def group_by[K](
        self, key: Callable[[T], K] | None = None
    ) -> Iter[tuple[K, Self] | tuple[T, Self]]: ...
    def group_by(
        self, key: Callable[[T], Any] | None = None
    ) -> Iter[tuple[Any | T, Self]]:
        """Make an `Iter` that returns consecutive keys and groups from the iterable.

        Args:
            key (Callable[[T], Any] | None): Function computing a key value for each element.
                If not specified or None, **key** defaults to an identity function and returns the element unchanged.

        Returns:
            Iter[tuple[Any | T, Self]]: An `Iter` of `(key, value)` tuples.

        The values yielded are `(K, Self)` tuples, where the first element is the group key and the second element is an `Iter` of type `T` over the group values.

        The `Iter` needs to already be sorted on the same key function.

        This is because a new group is started every time the value of the **key** function changes.

        That behavior differs from SQL's `GROUP BY` which aggregates common elements regardless of their input order.

        Warning:
            You must materialize the second element of the tuple immediately when iterating over groups.

            Because `.group_by()` uses Python's `itertools.groupby` under the hood, each group's iterator shares internal state.

            When you advance to the next group, the previous group's iterator becomes invalid and will yield empty results.

        Example:
        ```python
        >>> import pyochain as pc
        >>> # Example 1: Group even and odd numbers
        >>> (
        ... pc.Iter.from_count() # create an infinite iterator of integers
        ... .take(8) # take the first 8
        ... .map(lambda x: (x % 2 == 0, x)) # map to (is_even, value)
        ... .sort(key=lambda x: x[0]) # sort by is_even
        ... .iter() # Since sort collects to a Vec, we need to convert back to Iter
        ... .group_by(lambda x: x[0]) # group by is_even
        ... .map_star(lambda g, vals: (g, vals.map_star(lambda _, y: y).into(list))) # extract values from groups, discarding keys, and materializing them to lists
        ... .collect() # collect the result
        ... .into(dict) # convert to dict
        ... )
        {False: [1, 3, 5, 7], True: [0, 2, 4, 6]}
        >>> # Example 2: Group by a common key, already sorted
        >>> data = [
        ...     {"name": "Alice", "gender": "F"},
        ...     {"name": "Bob", "gender": "M"},
        ...     {"name": "Charlie", "gender": "M"},
        ...     {"name": "Dan", "gender": "M"},
        ... ]
        >>> (
        ... pc.Iter(data)
        ... .group_by(lambda x: x["gender"]) # group by the gender key
        ... .map_star(lambda g, vals: (g, vals.length())) # get the length of each group
        ... .collect()
        ... )
        Seq(('F', 1), ('M', 3))
        >>> # Example 3: Incorrect usage with LATE materialization:
        >>> groups = pc.Iter(["a1", "a2", "b1"]).group_by(lambda x: x[0]).collect()
        >>> # Now iterate - TOO LATE! The group iterators are consumed
        >>> for g in groups:
        ...     print(g[1].collect())  # ❌ Empty!
        Seq()
        Seq()
        >>> # Example 4: Correct usage with intermediate materialization:
        >>> groups = (
        ...     pc.Iter(["a1", "a2", "b1"])
        ...     .group_by(lambda x: x[0])
        ...     .map_star(lambda g, vals: (g, vals.collect()))  # ✅ Materialize NOW
        ...     .collect()
        ...     .iter()
        ...     .for_each(lambda x: print(f"{x[0]}: {x[1]}"))
        ... )
        a: Seq('a1', 'a2')
        b: Seq('b1',)

        ```
        """
        new = self.__class__
        return Iter((x, new(y)) for x, y in itertools.groupby(self._inner, key))

    @overload
    def sort[U: SupportsRichComparison[Any]](
        self: Iter[U],
        *,
        key: None = None,
        reverse: bool = False,
    ) -> Vec[U]: ...
    @overload
    def sort(
        self,
        *,
        key: Callable[[T], SupportsRichComparison[Any]],
        reverse: bool = False,
    ) -> Vec[T]: ...
    @overload
    def sort(
        self,
        *,
        key: None = None,
        reverse: bool = False,
    ) -> Never: ...
    def sort(
        self,
        *,
        key: Callable[[T], SupportsRichComparison[Any]] | None = None,
        reverse: bool = False,
    ) -> Vec[Any]:
        """Sort the elements of the sequence.

        If a key function is provided, it is used to extract a comparison key from each element.

        Note:
            This method must consume the entire `Iter` to perform the sort.
            The result is a new `Vec` over the sorted sequence.

        Args:
            key (Callable[[T], SupportsRichComparison[Any]] | None): Function to extract a comparison key from each element.
            reverse (bool): Whether to sort in descending order.

        Returns:
            Vec[Any]: A `Vec` with elements sorted.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([3, 1, 2]).sort()
        Vec(1, 2, 3)

        ```
        """
        return Vec.from_ref(sorted(self._inner, reverse=reverse, key=key))

    def tail(self, n: int) -> Seq[T]:
        """Return a `Seq` of the last **n** elements of the `Iterator`.

        Args:
            n (int): Number of elements to return.

        Returns:
            Seq[T]: A `Seq` containing the last **n** elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 2, 3]).tail(2)
        Seq(2, 3)

        ```
        """
        return Seq(cz.itertoolz.tail(n, self._inner))

    def top_n(self, n: int, key: Callable[[T], Any] | None = None) -> Seq[T]:
        """Return a tuple of the top-n items according to key.

        Args:
            n (int): Number of top elements to return.
            key (Callable[[T], Any] | None): Function to extract a comparison key from each element.

        Returns:
            Seq[T]: A new Seq containing the top-n elements.

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 3, 2]).top_n(2)
        Seq(3, 2)

        ```
        """
        return Seq(cz.itertoolz.topk(n, self._inner, key=key))

    def most_common(self, n: int | None = None) -> Vec[tuple[T, int]]:
        """Return the **n** most common elements and their counts from the `Iterator`.

        If **n** is `None`, then all elements are returned.

        Args:
            n (int | None): Number of most common elements to return. Defaults to None (all elements).

        Returns:
            Vec[tuple[T, int]]: A `Vec` containing tuples of (element, count).

        Example:
        ```python
        >>> import pyochain as pc
        >>> pc.Iter([1, 1, 2, 3, 3, 3]).most_common(2)
        Vec((3, 3), (1, 2))

        ```
        """
        from collections import Counter

        return Vec.from_ref(Counter(self._inner).most_common(n))
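As the source above shows, `most_common` delegates to the standard library's `collections.Counter`; the equivalent plain-Python call is:

```python
from collections import Counter

# Count occurrences, then take the two most frequent (element, count) pairs.
counts = Counter([1, 1, 2, 3, 3, 3]).most_common(2)
# → [(3, 3), (1, 2)]
```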

__bool__()

Check if the Iterator has at least one element (mutates self).

After calling this, the Iterator still contains all elements.

Returns:

    bool: True if the Iterator has at least one element, False otherwise.

Examples:

>>> import pyochain as pc
>>> it = pc.Iter([1, 2, 3])
>>> bool(it)
True
>>> it.collect()  # All elements still available
Seq(1, 2, 3)

Source code in src/pyochain/_iter.py
def __bool__(self) -> bool:
    """Check if the `Iterator` has at least one element (mutates **self**).

    After calling this, the `Iterator` still contains all elements.

    Returns:
        bool: True if the `Iterator` has at least one element, False otherwise.

    Examples:
    ```python
    >>> import pyochain as pc
    >>> it = pc.Iter([1, 2, 3])
    >>> bool(it)
    True
    >>> it.collect()  # All elements still available
    Seq(1, 2, 3)

    ```
    """
    first = tuple(itertools.islice(self._inner, 1))
    self._inner = itertools.chain(first, self._inner)
    return len(first) > 0
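The peek-and-rechain trick used here can be reproduced with plain `itertools` if you want the same non-destructive emptiness check outside pyochain (the helper name `has_next` is illustrative, not part of the library):

```python
import itertools
from typing import Iterator, TypeVar

T = TypeVar("T")

def has_next(iterator: Iterator[T]) -> tuple[bool, Iterator[T]]:
    """Peek at one element, then rebuild an iterator that still yields it."""
    head = list(itertools.islice(iterator, 1))
    return len(head) > 0, itertools.chain(head, iterator)

non_empty, it = has_next(iter([1, 2, 3]))
# non_empty is True, and list(it) still yields [1, 2, 3]
```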

array_chunks(size)

Yield subiterators (chunks) that each yield a fixed number of elements, determined by size.

The last chunk will be shorter if there are not enough elements.

Parameters:

    size (int): Number of elements in each chunk. Required.

Returns:

    Iter[Self]: An iterable of iterators, each yielding size elements.

If the sub-iterables are read in order, the elements of iterable won't be stored in memory.

If they are read out of order, itertools.tee is used to cache elements as necessary.

>>> import pyochain as pc
>>> all_chunks = pc.Iter.from_count().array_chunks(4)
>>> c_1, c_2, c_3 = all_chunks.next(), all_chunks.next(), all_chunks.next()
>>> c_2.unwrap().collect()  # c_1's elements have been cached; c_3's haven't been
Seq(4, 5, 6, 7)
>>> c_1.unwrap().collect()
Seq(0, 1, 2, 3)
>>> c_3.unwrap().collect()
Seq(8, 9, 10, 11)
>>> pc.Seq([1, 2, 3, 4, 5, 6]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
Seq(Seq(1, 2, 3), Seq(4, 5, 6))
>>> pc.Seq([1, 2, 3, 4, 5, 6, 7, 8]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
Seq(Seq(1, 2, 3), Seq(4, 5, 6), Seq(7, 8))

Source code in src/pyochain/_iter.py
def array_chunks(self, size: int) -> Iter[Self]:
    """Yield subiterators (chunks) that each yield a fixed number elements, determined by size.

    The last chunk will be shorter if there are not enough elements.

    Args:
        size (int): Number of elements in each chunk.

    Returns:
        Iter[Self]: An iterable of iterators, each yielding size elements.

    If the sub-iterables are read in order, the elements of *iterable*
    won't be stored in memory.

    If they are read out of order, :func:`itertools.tee` is used to cache
    elements as necessary.
    ```python
    >>> import pyochain as pc
    >>> all_chunks = pc.Iter.from_count().array_chunks(4)
    >>> c_1, c_2, c_3 = all_chunks.next(), all_chunks.next(), all_chunks.next()
    >>> c_2.unwrap().collect()  # c_1's elements have been cached; c_3's haven't been
    Seq(4, 5, 6, 7)
    >>> c_1.unwrap().collect()
    Seq(0, 1, 2, 3)
    >>> c_3.unwrap().collect()
    Seq(8, 9, 10, 11)
    >>> pc.Seq([1, 2, 3, 4, 5, 6]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
    Seq(Seq(1, 2, 3), Seq(4, 5, 6))
    >>> pc.Seq([1, 2, 3, 4, 5, 6, 7, 8]).iter().array_chunks(3).map(lambda c: c.collect()).collect()
    Seq(Seq(1, 2, 3), Seq(4, 5, 6), Seq(7, 8))

    ```
    """
    from collections import deque
    from contextlib import suppress

    def _chunks() -> Iterator[Self]:
        def _ichunk(
            iterator: Iterator[T], n: int
        ) -> tuple[Iterator[T], Callable[[int], int]]:
            cache: deque[T] = deque()
            chunk = itertools.islice(iterator, n)

            def _generator() -> Iterator[T]:
                with suppress(StopIteration):
                    while True:
                        if cache:
                            yield cache.popleft()
                        else:
                            yield next(chunk)

            def _materialize_next(n: int) -> int:
                to_cache = n - len(cache)

                # materialize up to n
                if to_cache > 0:
                    cache.extend(itertools.islice(chunk, to_cache))

                # return number materialized up to n
                return min(n, len(cache))

            return (_generator(), _materialize_next)

        new = self.__class__
        while True:
            # Create new chunk
            chunk, _materialize_next = _ichunk(self._inner, size)

            # Check to see whether we're at the end of the source iterable
            if not _materialize_next(size):
                return

            yield new(chunk)
            _materialize_next(size)

    return Iter(_chunks())
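If you only ever read chunks in order, the caching machinery above can be collapsed into a much shorter eager sketch (this drops the out-of-order support; `eager_chunks` is an illustrative name, not a pyochain API):

```python
import itertools
from typing import Iterable, Iterator, TypeVar

T = TypeVar("T")

def eager_chunks(iterable: Iterable[T], size: int) -> Iterator[list[T]]:
    """Yield lists of up to `size` elements; the last list may be shorter."""
    it = iter(iterable)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

list(eager_chunks(range(8), 3))
# → [[0, 1, 2], [3, 4, 5], [6, 7]]
```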

batch(n, *, strict=False)

Batch elements into tuples of length n and return a new Iter.

  • The last batch may be shorter than n.
  • The data is consumed lazily, just enough to fill a batch.
  • The result is yielded as soon as a batch is full or when the input iterable is exhausted.

Parameters:

    n (int): Number of elements in each batch. Required.
    strict (bool): If True, raises a ValueError if the last batch is not of length n. Defaults to False.

Returns:

    Iter[tuple[T, ...]]: An iterable of batched tuples.

Example:

>>> import pyochain as pc
>>> pc.Iter("ABCDEFG").batch(3).collect()
Seq(('A', 'B', 'C'), ('D', 'E', 'F'), ('G',))

Source code in src/pyochain/_iter.py
def batch(self, n: int, *, strict: bool = False) -> Iter[tuple[T, ...]]:
    """Batch elements into tuples of length n and return a new Iter.

    - The last batch may be shorter than n.
    - The data is consumed lazily, just enough to fill a batch.
    - The result is yielded as soon as a batch is full or when the input iterable is exhausted.

    Args:
        n (int): Number of elements in each batch.
        strict (bool): If `True`, raises a ValueError if the last batch is not of length n.

    Returns:
        Iter[tuple[T, ...]]: An iterable of batched tuples.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter("ABCDEFG").batch(3).collect()
    Seq(('A', 'B', 'C'), ('D', 'E', 'F'), ('G',))

    ```
    """
    return Iter(itertools.batched(self._inner, n, strict=strict))
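Under the hood this is `itertools.batched` (added in Python 3.12; the `strict` flag in 3.13). On older interpreters the same semantics can be sketched with `islice` (the name `batched_compat` is illustrative):

```python
import itertools
from typing import Iterable, Iterator, TypeVar

T = TypeVar("T")

def batched_compat(data: Iterable[T], n: int, *, strict: bool = False) -> Iterator[tuple[T, ...]]:
    """Yield n-tuples; optionally raise if the final batch is incomplete."""
    it = iter(data)
    while batch := tuple(itertools.islice(it, n)):
        if strict and len(batch) != n:
            raise ValueError("incomplete batch")
        yield batch

list(batched_compat("ABCDEFG", 3))
# → [('A', 'B', 'C'), ('D', 'E', 'F'), ('G',)]
```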

cloned()

Clone the Iter into a new independent Iter using itertools.tee.

After calling this method, the original Iter will continue to yield elements independently of the cloned one.

Note

Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

This is the unavoidable cost of having two independent iterators over the same source.

However, once both iterators have passed a value, it's freed from memory.

Returns:

    Self: A new independent cloned iterator.

Example:

>>> import pyochain as pc
>>> it = pc.Iter([1, 2, 3])
>>> cloned = it.cloned()
>>> cloned.collect()
Seq(1, 2, 3)
>>> it.collect()
Seq(1, 2, 3)

Source code in src/pyochain/_iter.py
def cloned(self) -> Self:
    """Clone the `Iter` into a new independent `Iter` using `itertools.tee`.

    After calling this method, the original `Iter` will continue to yield elements independently of the cloned one.

    Note:
        Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

        This is the unavoidable cost of having two independent iterators over the same source.

        However, once both iterators have passed a value, it's freed from memory.

    Returns:
        Self: A new independent cloned iterator.

    Example:
    ```python
    >>> import pyochain as pc
    >>> it = pc.Iter([1, 2, 3])
    >>> cloned = it.cloned()
    >>> cloned.collect()
    Seq(1, 2, 3)
    >>> it.collect()
    Seq(1, 2, 3)

    ```
    """
    it1, it2 = itertools.tee(self._inner)
    self._inner = it1
    return self.__class__(it2)
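The buffering behavior described in the note is plain `itertools.tee`: whichever twin runs ahead fills a shared buffer that the other later drains. A minimal demonstration:

```python
import itertools

src = iter(range(3))
a, b = itertools.tee(src)

next(a)   # a runs ahead; 0 is kept in the shared buffer for b
next(a)   # buffer now holds 0 and 1 for b
list(b)   # b drains the buffer, then the rest of src
# → [0, 1, 2]
```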

collect(collector=Seq[T])

Transforms an Iter into a collection.

The most basic pattern in which collect() is used is to turn one collection into another.

You take a collection, call iter() on it, do a bunch of transformations, and then collect() at the end.

You can specify the target collection type by providing a collector function or type.

This can be any Callable that takes an Iterator[T] and returns a Collection[T] of those types.

Note

This can be thought of as .into() with a default value (Seq[T]), and a different constraint (Collection[Any]). However, the runtime behavior is identical in both cases: pass self to the provided function, return the result.

Parameters:

    collector (Callable[[Iterator[T]], R]): Function or type that defines the target collection. R is constrained to a Collection. Defaults to Seq[T].

Returns:

    R: A materialized collection containing the collected elements.

Example:

>>> import pyochain as pc
>>> pc.Iter(range(5)).collect()
Seq(0, 1, 2, 3, 4)
>>> iterator = pc.Iter((1, 2, 3))
>>> iterator._inner.__class__.__name__
'tuple_iterator'
>>> mapped = iterator.map(lambda x: x * 2)
>>> mapped._inner.__class__.__name__
'map'
>>> mapped.collect()
Seq(2, 4, 6)
>>> # iterator is now exhausted
>>> iterator.collect()
Seq()
>>> pc.Iter(range(5)).collect(list)
[0, 1, 2, 3, 4]
>>> pc.Iter(range(5)).collect(pc.Vec)
Vec(0, 1, 2, 3, 4)
>>> iterator = pc.Iter([1, 2, 3])
>>> iterator._inner.__class__.__name__
'list_iterator'

Source code in src/pyochain/_iter.py
def collect[R: Collection[Any]](
    self, collector: Callable[[Iterator[T]], R] = Seq[T]
) -> R:
    """Transforms an `Iter` into a collection.

    The most basic pattern in which collect() is used is to turn one collection into another.

    You take a collection, call `iter()` on it, do a bunch of transformations, and then `collect()` at the end.

    You can specify the target collection type by providing a **collector** function or type.

    This can be any `Callable` that takes an `Iterator[T]` and returns a `Collection[T]` of those types.

    Note:
        This can be thought of as `.into()` with a default value (`Seq[T]`), and a different constraint (`Collection[Any]`).
        However, the runtime behavior is identical in both cases: pass **self** to the provided function, return the result.

    Args:
        collector (Callable[[Iterator[T]], R]): Function|type that defines the target collection. `R` is constrained to a `Collection`.

    Returns:
        R: A materialized collection containing the collected elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter(range(5)).collect()
    Seq(0, 1, 2, 3, 4)
    >>> iterator = pc.Iter((1, 2, 3))
    >>> iterator._inner.__class__.__name__
    'tuple_iterator'
    >>> mapped = iterator.map(lambda x: x * 2)
    >>> mapped._inner.__class__.__name__
    'map'
    >>> mapped.collect()
    Seq(2, 4, 6)
    >>> # iterator is now exhausted
    >>> iterator.collect()
    Seq()
    >>> pc.Iter(range(5)).collect(list)
    [0, 1, 2, 3, 4]
    >>> pc.Iter(range(5)).collect(pc.Vec)
    Vec(0, 1, 2, 3, 4)
    >>> iterator = pc.Iter([1, 2, 3])
    >>> iterator._inner.__class__.__name__
    'list_iterator'

    ```
    """
    return collector(self._inner)

collect_into(collection)

collect_into(collection: Vec[T]) -> Vec[T]
collect_into(collection: list[T]) -> list[T]

Collects all the items from the Iterator into a MutableSequence.

This method consumes the Iterator and adds all its items to the passed MutableSequence.

The MutableSequence is then returned, so the call chain can be continued.

This is useful when you already have a MutableSequence and want to add the Iterator items to it.

This is a convenience wrapper around MutableSequence.extend(), called on an Iterator rather than on the MutableSequence itself.

Parameters:

    collection (MutableSequence[T]): A mutable collection to collect items into. Required.

Returns:

    MutableSequence[T]: The same mutable collection passed as argument, now containing the collected items.

Example: Basic usage:

>>> import pyochain as pc
>>> a = pc.Seq([1, 2, 3])
>>> vec = pc.Vec([0, 1])
>>> a.iter().map(lambda x: x * 2).collect_into(vec)
Vec(0, 1, 2, 4, 6)
>>> a.iter().map(lambda x: x * 10).collect_into(vec)
Vec(0, 1, 2, 4, 6, 10, 20, 30)
The returned mutable sequence can be used to continue the call chain:
>>> import pyochain as pc
>>> a = pc.Seq([1, 2, 3])
>>> vec = pc.Vec[int].new()
>>> a.iter().collect_into(vec).length() == vec.length()
True
>>> a.iter().collect_into(vec).length() == vec.length()
True

Source code in src/pyochain/_iter.py
def collect_into(self, collection: MutableSequence[T]) -> MutableSequence[T]:
    """Collects all the items from the `Iterator` into a `MutableSequence`.

    This method consumes the `Iterator` and adds all its items to the passed `MutableSequence`.

    The `MutableSequence` is then returned, so the call chain can be continued.

    This is useful when you already have a `MutableSequence` and want to add the `Iterator` items to it.

    This is a convenience wrapper around `MutableSequence.extend()`, called on an `Iterator` rather than on the `MutableSequence` itself.

    Args:
        collection (MutableSequence[T]): A mutable collection to collect items into.

    Returns:
        MutableSequence[T]: The same mutable collection passed as argument, now containing the collected items.

    Example:
    Basic usage:
    ```python
    >>> import pyochain as pc
    >>> a = pc.Seq([1, 2, 3])
    >>> vec = pc.Vec([0, 1])
    >>> a.iter().map(lambda x: x * 2).collect_into(vec)
    Vec(0, 1, 2, 4, 6)
    >>> a.iter().map(lambda x: x * 10).collect_into(vec)
    Vec(0, 1, 2, 4, 6, 10, 20, 30)

    ```
    The returned mutable sequence can be used to continue the call chain:
    ```python
    >>> import pyochain as pc
    >>> a = pc.Seq([1, 2, 3])
    >>> vec = pc.Vec[int].new()
    >>> a.iter().collect_into(vec).length() == vec.length()
    True
    >>> a.iter().collect_into(vec).length() == vec.length()
    True

    ```
    """
    collection.extend(self._inner)
    return collection
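As the body shows, the whole method is `MutableSequence.extend` plus returning the receiver; a standalone sketch of the same flow with stdlib types:

```python
from typing import Iterator, MutableSequence, TypeVar

T = TypeVar("T")

def collect_into(iterator: Iterator[T], collection: MutableSequence[T]) -> MutableSequence[T]:
    """Drain the iterator into the collection, then hand the collection back."""
    collection.extend(iterator)
    return collection

out: list[int] = [0, 1]
collect_into((x * 2 for x in [1, 2, 3]), out)
# → out is now [0, 1, 2, 4, 6]
```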

combinations(r)

combinations(r: Literal[2]) -> Iter[tuple[T, T]]
combinations(r: Literal[3]) -> Iter[tuple[T, T, T]]
combinations(r: Literal[4]) -> Iter[tuple[T, T, T, T]]
combinations(r: Literal[5]) -> Iter[tuple[T, T, T, T, T]]

Return all combinations of length r.

Parameters:

    r (int): Length of each combination. Required.

Returns:

    Iter[tuple[T, ...]]: An iterable of combinations.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).combinations(2).collect()
Seq((1, 2), (1, 3), (2, 3))

Source code in src/pyochain/_iter.py
def combinations(self, r: int) -> Iter[tuple[T, ...]]:
    """Return all combinations of length r.

    Args:
        r (int): Length of each combination.

    Returns:
        Iter[tuple[T, ...]]: An iterable of combinations.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).combinations(2).collect()
    Seq((1, 2), (1, 3), (2, 3))

    ```
    """
    return Iter(itertools.combinations(self._inner, r))

combinations_with_replacement(r)

combinations_with_replacement(
    r: Literal[2],
) -> Iter[tuple[T, T]]
combinations_with_replacement(
    r: Literal[3],
) -> Iter[tuple[T, T, T]]
combinations_with_replacement(
    r: Literal[4],
) -> Iter[tuple[T, T, T, T]]
combinations_with_replacement(
    r: Literal[5],
) -> Iter[tuple[T, T, T, T, T]]

Return all combinations with replacement of length r.

Parameters:

    r (int): Length of each combination. Required.

Returns:

    Iter[tuple[T, ...]]: An iterable of combinations with replacement.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).combinations_with_replacement(2).collect()
Seq((1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3))

Source code in src/pyochain/_iter.py
def combinations_with_replacement(self, r: int) -> Iter[tuple[T, ...]]:
    """Return all combinations with replacement of length r.

    Args:
        r (int): Length of each combination.

    Returns:
        Iter[tuple[T, ...]]: An iterable of combinations with replacement.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).combinations_with_replacement(2).collect()
    Seq((1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3))

    ```
    """
    return Iter(itertools.combinations_with_replacement(self._inner, r))

diff_at(other, key=None)

Yields pairs of differing elements from two iterables.

Compares elements from the source iterable and another iterable at corresponding positions.

If elements differ (based on equality or a provided key function), yields a tuple containing the differing elements wrapped in Option.

If one iterable is shorter, yields NONE for missing elements.

Parameters:

    other (Iterable[T]): Other Iterable to compare with. Required.
    key (Callable[[T], R] | None): Function to apply to each item for comparison. Defaults to None.

Returns:

    Iter[tuple[Option[T], Option[T]]]: An Iter of item pairs containing differing elements.

Example:

>>> import pyochain as pc
>>> data = pc.Seq([1, 2, 3])
>>> data.iter().diff_at([1, 2, 10, 100]).collect()
Seq((Some(3), Some(10)), (NONE, Some(100)))
>>> data.iter().diff_at([1, 2, 10, 100, 2, 6, 7]).collect() # doctest: +NORMALIZE_WHITESPACE
Seq((Some(3), Some(10)),
(NONE, Some(100)),
(NONE, Some(2)),
(NONE, Some(6)),
(NONE, Some(7)))
>>> pc.Iter(["apples", "bananas"]).diff_at(["Apples", "Oranges"], key=str.lower).collect(list)
[(Some('bananas'), Some('Oranges'))]

Source code in src/pyochain/_iter.py
def diff_at[R](
    self, other: Iterable[T], key: Callable[[T], R] | None = None
) -> Iter[tuple[Option[T], Option[T]]]:
    """Yields pairs of differing elements from two iterables.

    Compares elements from the source iterable and another iterable at corresponding positions.

    If elements differ (based on equality or a provided key function), yields a tuple containing the differing elements wrapped in `Option`.

    If one iterable is shorter, yields `NONE` for missing elements.

    Args:
        other (Iterable[T]): Other `Iterable` to compare with.
        key (Callable[[T], R] | None): Function to apply to each item for comparison.

    Returns:
        Iter[tuple[Option[T], Option[T]]]: An `Iter` of item pairs containing differing elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = pc.Seq([1, 2, 3])
    >>> data.iter().diff_at([1, 2, 10, 100]).collect()
    Seq((Some(3), Some(10)), (NONE, Some(100)))
    >>> data.iter().diff_at([1, 2, 10, 100, 2, 6, 7]).collect() # doctest: +NORMALIZE_WHITESPACE
    Seq((Some(3), Some(10)),
    (NONE, Some(100)),
    (NONE, Some(2)),
    (NONE, Some(6)),
    (NONE, Some(7)))
    >>> pc.Iter(["apples", "bananas"]).diff_at(["Apples", "Oranges"], key=str.lower).collect(list)
    [(Some('bananas'), Some('Oranges'))]

    ```
    """
    if key is None:

        def _gen_no_key() -> Iterator[tuple[Option[T], Option[T]]]:
            for first, second in itertools.zip_longest(
                map(Some, self), map(Some, other), fillvalue=NONE
            ):
                if first.ne(second):
                    yield first, second

        return Iter(_gen_no_key())

    def _gen_with_key() -> Iterator[tuple[Option[T], Option[T]]]:
        for first, second in itertools.zip_longest(
            map(Some, self), map(Some, other), fillvalue=NONE
        ):
            if first.map(key).ne(second.map(key)):
                yield first, second

    return Iter(_gen_with_key())
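A plain-Python sketch of the same positional comparison, using `itertools.zip_longest` with a sentinel object in place of pyochain's `NONE` (the names `diff_at` and `MISSING` here are illustrative, not part of the library):

```python
from itertools import zip_longest

MISSING = object()  # stands in for pyochain's NONE


def diff_at(a, b, key=None):
    """Yield (x, y) pairs from a and b at positions where they differ."""
    k = key if key is not None else (lambda x: x)
    for x, y in zip_longest(a, b, fillvalue=MISSING):
        same = (
            x is not MISSING
            and y is not MISSING
            and k(x) == k(y)
        )
        if not same:
            yield x, y


pairs = list(diff_at([1, 2, 3], [1, 2, 10, 100]))
# pairs == [(3, 10), (MISSING, 100)]
```

The short-circuiting `and` ensures the key function is never applied to the sentinel, mirroring how the library only maps `key` over present (`Some`) values.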

enumerate(start=0)

Return an Iter of (index, value) pairs.

Each value in the Iter is paired with its index, starting from start (default 0).

Tip

Iter.map_star can then be used for subsequent operations on the index and value, in a destructuring manner. This keeps the code clean and readable, avoiding index access like [0] and [1] in inline lambdas.

Parameters:

Name Type Description Default
start int

The starting index.

0

Returns:

Type Description
Iter[tuple[int, T]]

Iter[tuple[int, T]]: An Iter of (index, value) pairs.

Example:

>>> import pyochain as pc
>>> pc.Iter(["a", "b"]).enumerate().collect()
Seq((0, 'a'), (1, 'b'))
>>> pc.Iter(["a", "b"]).enumerate().map_star(lambda idx, val: (idx, val.upper())).collect()
Seq((0, 'A'), (1, 'B'))

Source code in src/pyochain/_iter.py
def enumerate(self, start: int = 0) -> Iter[tuple[int, T]]:
    """Return a `Iter` of (index, value) pairs.

    Each value in the `Iter` is paired with its index, starting from 0.

    Tip:
        `Iter.map_star` can then be used for subsequent operations on the index and value, in a destructuring manner.
        This keep the code clean and readable, without index access like `[0]` and `[1]` for inline lambdas.

    Args:
        start (int): The starting index.

    Returns:
        Iter[tuple[int, T]]: An `Iter` of (index, value) pairs.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter(["a", "b"]).enumerate().collect()
    Seq((0, 'a'), (1, 'b'))
    >>> pc.Iter(["a", "b"]).enumerate().map_star(lambda idx, val: (idx, val.upper())).collect()
    Seq((0, 'A'), (1, 'B'))

    ```
    """
    return Iter(enumerate(self._inner, start))

filter(func)

filter(func: Callable[[T], TypeIs[U]]) -> Iter[U]
filter(func: Callable[[T], bool]) -> Iter[T]

Creates an Iter which uses a closure to determine if an element should be yielded.

Given an element, the closure must return `True` or `False`.

The returned Iter will yield only the elements for which the closure returns true.

The closure can return a TypeIs to narrow the type of the returned iterable.

This won't have any runtime effect, but allows for better type inference.

Note

Iter.filter(f).next() is equivalent to Iter.find(f).

Parameters:

Name Type Description Default
func Callable[[T], bool | TypeIs[U]]

Function to evaluate each item.

required

Returns:

Type Description
Iter[T] | Iter[U]

Iter[T] | Iter[U]: An iterable of the items that satisfy the predicate.

Example:

>>> import pyochain as pc
>>> data = (1, 2, 3)
>>> pc.Iter(data).filter(lambda x: x > 1).collect()
Seq(2, 3)
>>> # See the equivalence of next and find:
>>> pc.Iter(data).filter(lambda x: x > 1).next()
Some(2)
>>> pc.Iter(data).find(lambda x: x > 1)
Some(2)
>>> # Using TypeIs to narrow type:
>>> from typing import TypeIs
>>> def _is_str(x: object) -> TypeIs[str]:
...     return isinstance(x, str)
>>> mixed_data = [1, "two", 3.0, "four"]
>>> pc.Iter(mixed_data).filter(_is_str).collect()
Seq('two', 'four')

Source code in src/pyochain/_iter.py
def filter[U](self, func: Callable[[T], bool | TypeIs[U]]) -> Iter[T] | Iter[U]:
    """Creates an `Iter` which uses a closure to determine if an element should be yielded.

    Given an element, the closure must return `True` or `False`.

    The returned `Iter` will yield only the elements for which the closure returns true.

    The closure can return a `TypeIs` to narrow the type of the returned iterable.

    This won't have any runtime effect, but allows for better type inference.

    Note:
        `Iter.filter(f).next()` is equivalent to `Iter.find(f)`.

    Args:
        func (Callable[[T], bool | TypeIs[U]]): Function to evaluate each item.

    Returns:
        Iter[T] | Iter[U]: An iterable of the items that satisfy the predicate.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = (1, 2, 3)
    >>> pc.Iter(data).filter(lambda x: x > 1).collect()
    Seq(2, 3)
    >>> # See the equivalence of next and find:
    >>> pc.Iter(data).filter(lambda x: x > 1).next()
    Some(2)
    >>> pc.Iter(data).find(lambda x: x > 1)
    Some(2)
    >>> # Using TypeIs to narrow type:
    >>> from typing import TypeIs
    >>> def _is_str(x: object) -> TypeIs[str]:
    ...     return isinstance(x, str)
    >>> mixed_data = [1, "two", 3.0, "four"]
    >>> pc.Iter(mixed_data).filter(_is_str).collect()
    Seq('two', 'four')

    ```
    """
    return Iter(filter(func, self._inner))

filter_false(func)

filter_false(func: Callable[[T], TypeIs[U]]) -> Iter[U]
filter_false(func: Callable[[T], bool]) -> Iter[T]

Return elements for which func returns False.

The func can return a TypeIs to narrow the type of the returned Iter.

This won't have any runtime effect, but allows for better type inference.

Parameters:

Name Type Description Default
func Callable[[T], bool | TypeIs[U]]

Function to evaluate each item.

required

Returns:

Type Description
Iter[T] | Iter[U]

Iter[T] | Iter[U]: An Iter of the items that do not satisfy the predicate.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).filter_false(lambda x: x > 1).collect()
Seq(1,)

Source code in src/pyochain/_iter.py
def filter_false[U](
    self, func: Callable[[T], bool | TypeIs[U]]
) -> Iter[T] | Iter[U]:
    """Return elements for which **func** is `False`.

    The **func** can return a `TypeIs` to narrow the type of the returned `Iter`.

    This won't have any runtime effect, but allows for better type inference.

    Args:
        func (Callable[[T], bool | TypeIs[U]]): Function to evaluate each item.

    Returns:
        Iter[T] | Iter[U]: An `Iter` of the items that do not satisfy the predicate.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).filter_false(lambda x: x > 1).collect()
    Seq(1,)

    ```
    """
    return Iter(itertools.filterfalse(func, self._inner))

filter_map(func)

Creates an iterator that both filters and maps.

The returned iterator yields only the values for which the supplied closure returns Some(value).

filter_map can be used to make chains of filter and map more concise.

The example below shows how a map().filter().map() can be shortened to a single call to filter_map.

Parameters:

Name Type Description Default
func Callable[[T], Option[R]]

Function to apply to each item.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterable of the results where func returned Some.

Example:

>>> import pyochain as pc
>>> def _parse(s: str) -> pc.Result[int, str]:
...     try:
...         return pc.Ok(int(s))
...     except ValueError:
...         return pc.Err(f"Invalid integer, got {s!r}")
>>>
>>> data = pc.Seq(["1", "two", "NaN", "four", "5"])
>>> data.iter().filter_map(lambda s: _parse(s).ok()).collect()
Seq(1, 5)
>>> # Equivalent to:
>>> (
...     data.iter()
...    .map(lambda s: _parse(s).ok())
...    .filter(lambda s: s.is_some())
...    .map(lambda s: s.unwrap())
...    .collect()
... )
Seq(1, 5)

Source code in src/pyochain/_iter.py
def filter_map[R](self, func: Callable[[T], Option[R]]) -> Iter[R]:
    """Creates an iterator that both filters and maps.

    The returned iterator yields only the values for which the supplied closure returns Some(value).

    `filter_map` can be used to make chains of `filter` and map more concise.

    The example below shows how a `map().filter().map()` can be shortened to a single call to `filter_map`.

    Args:
        func (Callable[[T], Option[R]]): Function to apply to each item.

    Returns:
        Iter[R]: An iterable of the results where func returned `Some`.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def _parse(s: str) -> pc.Result[int, str]:
    ...     try:
    ...         return pc.Ok(int(s))
    ...     except ValueError:
    ...         return pc.Err(f"Invalid integer, got {s!r}")
    >>>
    >>> data = pc.Seq(["1", "two", "NaN", "four", "5"])
    >>> data.iter().filter_map(lambda s: _parse(s).ok()).collect()
    Seq(1, 5)
    >>> # Equivalent to:
    >>> (
    ...     data.iter()
    ...    .map(lambda s: _parse(s).ok())
    ...    .filter(lambda s: s.is_some())
    ...    .map(lambda s: s.unwrap())
    ...    .collect()
    ... )
    Seq(1, 5)

    ```
    """

    def _filter_map(data: Iterable[T]) -> Iterator[R]:
        for item in data:
            res = func(item)
            if res.is_some():
                yield res.unwrap()

    return Iter(_filter_map(self._inner))
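Without the `Option` type, the same fusion of `map` and `filter` can be sketched in plain Python with `None` as the "no value" signal. Note the caveat this sketch carries: it cannot distinguish a legitimate `None` result from "no value", which is exactly the ambiguity `Option` avoids.

```python
from collections.abc import Callable, Iterable, Iterator
from typing import Optional, TypeVar

T = TypeVar("T")
R = TypeVar("R")


def filter_map(func: Callable[[T], Optional[R]], data: Iterable[T]) -> Iterator[R]:
    """Yield func(item) for every item where func does not return None."""
    for item in data:
        res = func(item)
        if res is not None:
            yield res


def parse(s: str) -> Optional[int]:
    try:
        return int(s)
    except ValueError:
        return None


nums = list(filter_map(parse, ["1", "two", "NaN", "four", "5"]))
# nums == [1, 5]
```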

filter_map_star(func)

filter_map_star(
    func: Callable[[Any], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2, T3], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2, T3, T4], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2, T3, T4, T5], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7], Option[R]],
) -> Iter[R]
filter_map_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8], Option[R]
    ],
) -> Iter[R]
filter_map_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8, T9], Option[R]
    ],
) -> Iter[R]
filter_map_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], Option[R]
    ],
) -> Iter[R]

Creates an iterator that both filters and maps, where each element is an iterable.

Unlike .filter_map(), which passes each element as a single argument, .filter_map_star() unpacks each element into positional arguments for the function.

In short, for each element in the sequence, it computes func(*element).

This is useful after using methods like zip, product, or enumerate that yield tuples.

Parameters:

Name Type Description Default
func Callable[..., Option[R]]

Function to apply to unpacked elements.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterable of the results where func returned Some.

Example:

>>> import pyochain as pc
>>> data = pc.Seq([("1", "10"), ("two", "20"), ("3", "thirty")])
>>> def _parse_pair(s1: str, s2: str) -> pc.Result[tuple[int, int], str]:
...     try:
...         return pc.Ok((int(s1), int(s2)))
...     except ValueError:
...         return pc.Err(f"Invalid integer pair: {s1!r}, {s2!r}")
>>>
>>> data.iter().filter_map_star(lambda s1, s2: _parse_pair(s1, s2).ok()).collect()
Seq((1, 10),)

Source code in src/pyochain/_iter.py
def filter_map_star[U: Iterable[Any], R](
    self: Iter[U],
    func: Callable[..., Option[R]],
) -> Iter[R]:
    """Creates an iterator that both filters and maps, where each element is an iterable.

    Unlike `.filter_map()`, which passes each element as a single argument, `.filter_map_star()` unpacks each element into positional arguments for the function.

    In short, for each `element` in the sequence, it computes `func(*element)`.

    This is useful after using methods like `zip`, `product`, or `enumerate` that yield tuples.

    Args:
        func (Callable[..., Option[R]]): Function to apply to unpacked elements.

    Returns:
        Iter[R]: An iterable of the results where func returned `Some`.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = pc.Seq([("1", "10"), ("two", "20"), ("3", "thirty")])
    >>> def _parse_pair(s1: str, s2: str) -> pc.Result[tuple[int, int], str]:
    ...     try:
    ...         return pc.Ok((int(s1), int(s2)))
    ...     except ValueError:
    ...         return pc.Err(f"Invalid integer pair: {s1!r}, {s2!r}")
    >>>
    >>> data.iter().filter_map_star(lambda s1, s2: _parse_pair(s1, s2).ok()).collect()
    Seq((1, 10),)

    ```
    """

    def _filter_map_star(data: Iterable[U]) -> Iterator[R]:
        for item in data:
            res = func(*item)
            if res.is_some():
                yield res.unwrap()

    return Iter(_filter_map_star(self._inner))

filter_star(func)

filter_star(
    func: Callable[[Any], bool],
) -> Iter[tuple[Any]]
filter_star(
    func: Callable[[T1, T2], bool],
) -> Iter[tuple[T1, T2]]
filter_star(
    func: Callable[[T1, T2, T3], bool],
) -> Iter[tuple[T1, T2, T3]]
filter_star(
    func: Callable[[T1, T2, T3, T4], bool],
) -> Iter[tuple[T1, T2, T3, T4]]
filter_star(
    func: Callable[[T1, T2, T3, T4, T5], bool],
) -> Iter[tuple[T1, T2, T3, T4, T5]]
filter_star(
    func: Callable[[T1, T2, T3, T4, T5, T6], bool],
) -> Iter[tuple[T1, T2, T3, T4, T5, T6]]
filter_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7], bool],
) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7]]
filter_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8], bool],
) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8]]
filter_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8, T9], bool
    ],
) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9]]
filter_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], bool
    ],
) -> Iter[tuple[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10]]

Creates an Iter which uses a closure func to determine if an element should be yielded, where each element is an iterable.

Unlike .filter(), which passes each element as a single argument, .filter_star() unpacks each element into positional arguments for the func.

In short, for each element in the Iter, it computes func(*element).

This is useful after using methods like .zip(), .product(), or .enumerate() that yield tuples.

Parameters:

Name Type Description Default
func Callable[..., bool]

Function to evaluate unpacked elements.

required

Returns:

Type Description
Iter[U]

Iter[U]: An Iter of the items that satisfy the predicate.

Example:

>>> import pyochain as pc
>>> data = pc.Seq(["apple", "banana", "cherry", "date"])
>>> data.iter().enumerate().filter_star(lambda index, fruit: index % 2 == 0).map_star(lambda index, fruit: fruit.title()).collect()
Seq('Apple', 'Cherry')

Source code in src/pyochain/_iter.py
def filter_star[U: Iterable[Any]](
    self: Iter[U],
    func: Callable[..., bool],
) -> Iter[U]:
    """Creates an `Iter` which uses a closure **func** to determine if an element should be yielded, where each element is an iterable.

    Unlike `.filter()`, which passes each element as a single argument, `.filter_star()` unpacks each element into positional arguments for the **func**.

    In short, for each element in the `Iter`, it computes `func(*element)`.

    This is useful after using methods like `.zip()`, `.product()`, or `.enumerate()` that yield tuples.

    Args:
        func (Callable[..., bool]): Function to evaluate unpacked elements.

    Returns:
        Iter[U]: An `Iter` of the items that satisfy the predicate.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = pc.Seq(["apple", "banana", "cherry", "date"])
    >>> data.iter().enumerate().filter_star(lambda index, fruit: index % 2 == 0).map_star(lambda index, fruit: fruit.title()).collect()
    Seq('Apple', 'Cherry')

    ```
    """
    return Iter(filter(lambda x: func(*x), self._inner))

find_map(func)

Applies func to each element of the Iterator and returns the first Some(R) result.

Iter.find_map(f) is equivalent to Iter.filter_map(f).next().

Parameters:

Name Type Description Default
func Callable[[T], Option[R]]

Function to apply to each element, returning an Option[R].

required

Returns:

Type Description
Option[R]

Option[R]: The first Some(R) result from applying func, or NONE if no such result is found.

Example:

>>> import pyochain as pc
>>> def _parse(s: str) -> pc.Option[int]:
...     try:
...         return pc.Some(int(s))
...     except ValueError:
...         return pc.NONE
>>>
>>> pc.Iter(["lol", "NaN", "2", "5"]).find_map(_parse)
Some(2)

Source code in src/pyochain/_iter.py
def find_map[R](self, func: Callable[[T], Option[R]]) -> Option[R]:
    """Applies function to the elements of the `Iterator` and returns the first Some(R) result.

    `Iter.find_map(f)` is equivalent to `Iter.filter_map(f).next()`.

    Args:
        func (Callable[[T], Option[R]]): Function to apply to each element, returning an `Option[R]`.

    Returns:
        Option[R]: The first `Some(R)` result from applying `func`, or `NONE` if no such result is found.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def _parse(s: str) -> pc.Option[int]:
    ...     try:
    ...         return pc.Some(int(s))
    ...     except ValueError:
    ...         return pc.NONE
    >>>
    >>> pc.Iter(["lol", "NaN", "2", "5"]).find_map(_parse)
    Some(2)

    ```
    """
    return self.filter_map(func).next()

flat_map(func)

Creates an iterator that applies a function to each element of the original iterator and flattens the result.

This is useful when the func you want to pass to .map() itself returns an iterable, and you want to avoid having nested iterables in the output.

This is equivalent to calling .map(func).flatten().

Parameters:

Name Type Description Default
func Callable[[T], Iterable[R]]

Function to apply to each element.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterable of flattened transformed elements.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).flat_map(lambda x: range(x)).collect()
Seq(0, 0, 1, 0, 1, 2)

Source code in src/pyochain/_iter.py
def flat_map[R](self, func: Callable[[T], Iterable[R]]) -> Iter[R]:
    """Creates an iterator that applies a function to each element of the original iterator and flattens the result.

    This is useful when the **func** you want to pass to `.map()` itself returns an iterable, and you want to avoid having nested iterables in the output.

    This is equivalent to calling `.map(func).flatten()`.

    Args:
        func (Callable[[T], Iterable[R]]): Function to apply to each element.

    Returns:
        Iter[R]: An iterable of flattened transformed elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).flat_map(lambda x: range(x)).collect()
    Seq(0, 0, 1, 0, 1, 2)

    ```
    """
    return Iter(itertools.chain.from_iterable(map(func, self._inner)))

flatten()

flatten() -> Iter[U]
flatten() -> Iter[int]

Creates an Iter that flattens nested structure.

Returns:

Type Description
Iter[Any]

Iter[Any]: An Iter of flattened elements.

This is useful when you have an Iter of Iterable and you want to remove one level of indirection.

Examples:

Basic usage:

>>> import pyochain as pc
>>> data = [[1, 2, 3, 4], [5, 6]]
>>> flattened = pc.Iter(data).flatten().collect()
>>> flattened
Seq(1, 2, 3, 4, 5, 6)
Flattening strings yields their characters:
>>> import pyochain as pc
>>> words = pc.Iter(["alpha", "beta", "gamma"])
>>> merged = words.flatten().collect()
>>> merged
Seq('a', 'l', 'p', 'h', 'a', 'b', 'e', 't', 'a', 'g', 'a', 'm', 'm', 'a')
Flattening only removes one level of nesting at a time:
>>> import pyochain as pc
>>> d3 = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
>>> d2 = pc.Iter(d3).flatten().collect()
>>> d2
Seq([1, 2], [3, 4], [5, 6], [7, 8])
>>> d1 = pc.Iter(d3).flatten().flatten().collect()
>>> d1
Seq(1, 2, 3, 4, 5, 6, 7, 8)
Here we see that flatten() does not perform a “deep” flatten.

Instead, only one level of nesting is removed.

That is, if you flatten() a three-dimensional array, the result will be two-dimensional and not one-dimensional.

To get a one-dimensional structure, you have to flatten() again.

Source code in src/pyochain/_iter.py
def flatten[U: Iterable[Any]](self: Iter[U]) -> Iter[Any]:
    """Creates an `Iter` that flattens nested structure.

    Returns:
        Iter[Any]: An `Iter` of flattened elements.

    This is useful when you have an `Iter` of `Iterable` and you want to remove one level of indirection.

    Examples:
    Basic usage:
    ```python
    >>> import pyochain as pc
    >>> data = [[1, 2, 3, 4], [5, 6]]
    >>> flattened = pc.Iter(data).flatten().collect()
    >>> flattened
    Seq(1, 2, 3, 4, 5, 6)

    ```
    Flattening strings yields their characters:
    ```python
    >>> import pyochain as pc
    >>> words = pc.Iter(["alpha", "beta", "gamma"])
    >>> merged = words.flatten().collect()
    >>> merged
    Seq('a', 'l', 'p', 'h', 'a', 'b', 'e', 't', 'a', 'g', 'a', 'm', 'm', 'a')

    ```
    Flattening only removes one level of nesting at a time:
    ```python
    >>> import pyochain as pc
    >>> d3 = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
    >>> d2 = pc.Iter(d3).flatten().collect()
    >>> d2
    Seq([1, 2], [3, 4], [5, 6], [7, 8])
    >>> d1 = pc.Iter(d3).flatten().flatten().collect()
    >>> d1
    Seq(1, 2, 3, 4, 5, 6, 7, 8)

    ```
    Here we see that `flatten()` does not perform a “deep” flatten.

    Instead, only **one** level of nesting is removed.

    That is, if you `flatten()` a three-dimensional array, the result will be two-dimensional and not one-dimensional.

    To get a one-dimensional structure, you have to `flatten()` again.

    """
    return Iter(itertools.chain.from_iterable(self._inner))

from_count(start=0, step=1) staticmethod

Create an infinite Iterator of evenly spaced values.

Warning

This creates an infinite iterator.

Be sure to use Iter.take() or Iter.slice() to limit the number of items taken.

Parameters:

Name Type Description Default
start int

Starting value of the sequence.

0
step int

Difference between consecutive values.

1

Returns:

Type Description
Iter[int]

Iter[int]: An iterator generating the sequence.

Example:

>>> import pyochain as pc
>>> pc.Iter.from_count(10, 2).take(3).collect()
Seq(10, 12, 14)

Source code in src/pyochain/_iter.py
@staticmethod
def from_count(start: int = 0, step: int = 1) -> Iter[int]:
    """Create an infinite `Iterator` of evenly spaced values.

    Warning:
        This creates an infinite iterator.

        Be sure to use `Iter.take()` or `Iter.slice()` to limit the number of items taken.

    Args:
        start (int): Starting value of the sequence.
        step (int): Difference between consecutive values.

    Returns:
        Iter[int]: An iterator generating the sequence.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter.from_count(10, 2).take(3).collect()
    Seq(10, 12, 14)

    ```
    """
    return Iter(itertools.count(start, step))

from_fn(f) staticmethod

Create an Iter from a nullary generator function.

The callable must return:

  • Some(value) to yield a value
  • NONE to stop

Parameters:

Name Type Description Default
f Callable[[], Option[R]]

Callable that returns the next item wrapped in Option.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterator yielding values produced by f.

Example:

>>> import pyochain as pc
>>> counter = 0
>>> def gen() -> pc.Option[int]:
...     global counter
...     counter += 1
...     return pc.Some(counter) if counter < 6 else pc.NONE
>>> pc.Iter.from_fn(gen).collect()
Seq(1, 2, 3, 4, 5)

Source code in src/pyochain/_iter.py
@staticmethod
def from_fn[R](f: Callable[[], Option[R]]) -> Iter[R]:
    """Create an `Iter` from a nullary generator function.

    The callable must return:

    - `Some(value)` to yield a value
    - `NONE` to stop


    Args:
        f (Callable[[], Option[R]]): Callable that returns the next item wrapped in `Option`.

    Returns:
        Iter[R]: An iterator yielding values produced by **f**.

    Example:
    ```python
    >>> import pyochain as pc
    >>> counter = 0
    >>> def gen() -> pc.Option[int]:
    ...     global counter
    ...     counter += 1
    ...     return pc.Some(counter) if counter < 6 else pc.NONE
    >>> pc.Iter.from_fn(gen).collect()
    Seq(1, 2, 3, 4, 5)

    ```
    """

    def _from_fn() -> Iterator[R]:
        while True:
            item = f()
            if item.is_none():
                return
            yield item.unwrap()

    return Iter(_from_fn())
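The stdlib has a close relative: the two-argument form of the built-in `iter(callable, sentinel)` calls the function repeatedly until it returns the sentinel. A sketch of the same counter example in plain Python, with `None` playing the role of `pc.NONE`:

```python
import itertools

counter = itertools.count(1)


def gen():
    n = next(counter)
    return n if n < 6 else None  # None is the stop sentinel here


# iter(callable, sentinel) stops as soon as the callable returns the sentinel.
values = list(iter(gen, None))
# values == [1, 2, 3, 4, 5]
```

The `Option`-based `from_fn` avoids the sentinel's blind spot: a stream that legitimately produces `None` values can still be expressed as `Some(None)`.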

from_ref(other) classmethod

Create an independent lazy copy from another Iter.

Both the original and the returned Iter can be consumed independently, in a lazy manner.

Note

Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

This is the unavoidable cost of having two independent iterators over the same source.

However, once both iterators have passed a value, it's freed from memory.

See Also
  • Iter.cloned() which is the instance method version of this function.

Parameters:

Name Type Description Default
other Self

An Iter instance to copy.

required

Returns:

Name Type Description
Self Self

A new Iter instance that is independent from the original.

Example:

>>> import pyochain as pc
>>> original = pc.Iter([1, 2, 3])
>>> copy = pc.Iter.from_ref(original)
>>> copy.map(lambda x: x * 2).collect()
Seq(2, 4, 6)
>>> original.next()
Some(1)

Source code in src/pyochain/_iter.py
@classmethod
def from_ref(cls, other: Self) -> Self:
    """Create an independent lazy copy from another `Iter`.

    Both the original and the returned `Iter` can be consumed independently, in a lazy manner.

    Note:
        Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

        This is the unavoidable cost of having two independent iterators over the same source.

        However, once both iterators have passed a value, it's freed from memory.

    See Also:
        - `Iter.cloned()` which is the instance method version of this function.

    Args:
        other (Self): An `Iter` instance to copy.

    Returns:
        Self: A new `Iter` instance that is independent from the original.

    Example:
    ```python
    >>> import pyochain as pc
    >>> original = pc.Iter([1, 2, 3])
    >>> copy = pc.Iter.from_ref(original)
    >>> copy.map(lambda x: x * 2).collect()
    Seq(2, 4, 6)
    >>> original.next()
    Some(1)

    ```
    """
    it1, it2 = itertools.tee(other._inner)
    other._inner = it1
    return cls(it2)
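As the source shows, this is `itertools.tee` plus re-binding the original's inner iterator. A plain-Python sketch of the buffering behavior described in the Note above:

```python
import itertools

source = iter(range(5))
a, b = itertools.tee(source)
# After tee(), `source` must not be touched directly anymore.

# Advancing `a` buffers the consumed values internally...
first_three = [next(a) for _ in range(3)]

# ...so `b` still sees the full stream from the start.
rest_of_b = list(b)

# first_three == [0, 1, 2]; rest_of_b == [0, 1, 2, 3, 4]
```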

group_by(key=None)

group_by(key: None = None) -> Iter[tuple[T, Self]]
group_by(key: Callable[[T], K]) -> Iter[tuple[K, Self]]
group_by(
    key: Callable[[T], K] | None = None,
) -> Iter[tuple[K, Self] | tuple[T, Self]]

Make an Iter that returns consecutive keys and groups from the iterable.

Parameters:

Name Type Description Default
key Callable[[T], Any] | None

Function computing a key value for each element.

None

If not specified or is None, key defaults to an identity function and returns the element unchanged.

Returns:

Type Description
Iter[tuple[Any | T, Self]]

Iter[tuple[Any | T, Self]]: An Iter of (key, value) tuples.

The values yielded are (K, Self) tuples, where the first element is the group key and the second element is an Iter of type T over the group values.

The Iter must already be sorted by the same key function.

This is because a new group is started every time the value of the key function changes.

That behavior differs from SQL's GROUP BY, which aggregates common elements regardless of their input order.

Warning

You must materialize the second element of the tuple immediately when iterating over groups.

Because .group_by() uses Python's itertools.groupby under the hood, each group's iterator shares internal state.

When you advance to the next group, the previous group's iterator becomes invalid and will yield empty results.

Example:

>>> import pyochain as pc
>>> # Example 1: Group even and odd numbers
>>> (
... pc.Iter.from_count() # create an infinite iterator of integers
... .take(8) # take the first 8
... .map(lambda x: (x % 2 == 0, x)) # map to (is_even, value)
... .sort(key=lambda x: x[0]) # sort by is_even
... .iter() # Since sort collects into a Vec, we need to convert back to Iter
... .group_by(lambda x: x[0]) # group by is_even
... .map_star(lambda g, vals: (g, vals.map_star(lambda _, y: y).into(list))) # extract values from groups, discarding keys, and materializing them to lists
... .collect() # collect the result
... .into(dict) # convert to dict
... )
{False: [1, 3, 5, 7], True: [0, 2, 4, 6]}
>>> # Example 2: Group by a common key, already sorted
>>> data = [
...     {"name": "Alice", "gender": "F"},
...     {"name": "Bob", "gender": "M"},
...     {"name": "Charlie", "gender": "M"},
...     {"name": "Dan", "gender": "M"},
... ]
>>> (
... pc.Iter(data)
... .group_by(lambda x: x["gender"]) # group by the gender key
... .map_star(lambda g, vals: (g, vals.length())) # get the length of each group
... .collect()
... )
Seq(('F', 1), ('M', 3))
>>> # Example 3: Incorrect usage with LATE materialization:
>>> groups = pc.Iter(["a1", "a2", "b1"]).group_by(lambda x: x[0]).collect()
>>> # Now iterate - TOO LATE! The group iterators are consumed
>>> for g in groups:
...     print(g[1].collect())  # ❌ Empty!
Seq()
Seq()
>>> # Example 4: Correct usage with intermediate materialization:
>>> groups = (
...     pc.Iter(["a1", "a2", "b1"])
...     .group_by(lambda x: x[0])
...     .map_star(lambda g, vals: (g, vals.collect()))  # ✅ Materialize NOW
...     .collect()
...     .iter()
...     .for_each(lambda x: print(f"{x[0]}: {x[1]}"))
... )
a: Seq('a1', 'a2')
b: Seq('b1',)

Source code in src/pyochain/_iter.py
def group_by(
    self, key: Callable[[T], Any] | None = None
) -> Iter[tuple[Any | T, Self]]:
    """Make an `Iter` that returns consecutive keys and groups from the iterable.

    Args:
        key (Callable[[T], Any] | None): Function computing a key value for each element.
    If not specified or is None, **key** defaults to an identity function and returns the element unchanged.

    Returns:
        Iter[tuple[Any | T, Self]]: An `Iter` of `(key, value)` tuples.

    The values yielded are `(K, Self)` tuples, where the first element is the group key and the second element is an `Iter` of type `T` over the group values.

    The `Iter` needs to already be sorted on the same key function.

    This is because a new `Group` starts every time the value of the **key** function changes.

    That behavior differs from SQL's `GROUP BY` which aggregates common elements regardless of their input order.

    Warning:
        You must materialize the second element of the tuple immediately when iterating over groups.

        Because `.group_by()` uses Python's `itertools.groupby` under the hood, each group's iterator shares internal state.

        When you advance to the next group, the previous group's iterator becomes invalid and will yield empty results.

    Example:
    ```python
    >>> import pyochain as pc
    >>> # Example 1: Group even and odd numbers
    >>> (
    ... pc.Iter.from_count() # create an infinite iterator of integers
    ... .take(8) # take the first 8
    ... .map(lambda x: (x % 2 == 0, x)) # map to (is_even, value)
    ... .sort(key=lambda x: x[0]) # sort by is_even
    ... .iter() # Since sort collects into a Vec, we need to convert back to Iter
    ... .group_by(lambda x: x[0]) # group by is_even
    ... .map_star(lambda g, vals: (g, vals.map_star(lambda _, y: y).into(list))) # extract values from groups, discarding keys, and materializing them to lists
    ... .collect() # collect the result
    ... .into(dict) # convert to dict
    ... )
    {False: [1, 3, 5, 7], True: [0, 2, 4, 6]}
    >>> # Example 2: Group by a common key, already sorted
    >>> data = [
    ...     {"name": "Alice", "gender": "F"},
    ...     {"name": "Bob", "gender": "M"},
    ...     {"name": "Charlie", "gender": "M"},
    ...     {"name": "Dan", "gender": "M"},
    ... ]
    >>> (
    ... pc.Iter(data)
    ... .group_by(lambda x: x["gender"]) # group by the gender key
    ... .map_star(lambda g, vals: (g, vals.length())) # get the length of each group
    ... .collect()
    ... )
    Seq(('F', 1), ('M', 3))
    >>> # Example 3: Incorrect usage with LATE materialization:
    >>> groups = pc.Iter(["a1", "a2", "b1"]).group_by(lambda x: x[0]).collect()
    >>> # Now iterate - TOO LATE! The group iterators are consumed
    >>> for g in groups:
    ...     print(g[1].collect())  # ❌ Empty!
    Seq()
    Seq()
    >>> # Example 4: Correct usage with intermediate materialization:
    >>> groups = (
    ...     pc.Iter(["a1", "a2", "b1"])
    ...     .group_by(lambda x: x[0])
    ...     .map_star(lambda g, vals: (g, vals.collect()))  # ✅ Materialize NOW
    ...     .collect()
    ...     .iter()
    ...     .for_each(lambda x: print(f"{x[0]}: {x[1]}"))
    ... )
    a: Seq('a1', 'a2')
    b: Seq('b1',)

    ```
    """
    new = self.__class__
    return Iter((x, new(y)) for x, y in itertools.groupby(self._inner, key))

is_strictly_n(n)

Yield Ok[T] as long as the iterable has exactly n items.

If it has fewer than n items, yield Err[ValueError] with the actual number of items.

If it has more than n items, yield Err[ValueError] with the number n + 1.

Note that the returned iterable must be consumed in order for the check to be made.

Parameters:

Name Type Description Default
n int

The exact number of items expected.

required

Returns:

Type Description
Iter[Result[T, ValueError]]

Iter[Result[T, ValueError]]: A new Iterable wrapper yielding results based on the item count.

Example:

>>> import pyochain as pc
>>> data = ["a", "b", "c", "d"]
>>> n = 4
>>> pc.Iter(data).is_strictly_n(n).collect()
Seq(Ok('a'), Ok('b'), Ok('c'), Ok('d'))
>>> pc.Iter("ab").is_strictly_n(3).collect()  # doctest: +NORMALIZE_WHITESPACE
Seq(Ok('a'), Ok('b'),
Err(ValueError('Too few items in iterable (got 2)')))
>>> pc.Iter("abc").is_strictly_n(2).collect()  # doctest: +NORMALIZE_WHITESPACE
Seq(Ok('a'), Ok('b'),
Err(ValueError('Too many items in iterable (got at least 3)')))
You can easily combine this with .map(lambda r: r.map_err(...)) to handle the errors as you wish.
>>> def _my_err(e: ValueError) -> str:
...     return f"custom error: {e}"
>>>
>>> pc.Iter([1]).is_strictly_n(0).map(lambda r: r.map_err(_my_err)).collect()
Seq(Err('custom error: Too many items in iterable (got at least 1)'),)
Or use .filter_map(...) to only keep the Ok values.
>>> pc.Iter([1, 2, 3]).is_strictly_n(2).filter_map(lambda r: r.ok()).collect()
Seq(1, 2)

Source code in src/pyochain/_iter.py
def is_strictly_n(self, n: int) -> Iter[Result[T, ValueError]]:
    """Yield `Ok[T]` as long as the iterable has exactly *n* items.

    If it has fewer than *n* items, yield `Err[ValueError]` with the actual number of items.

    If it has more than *n* items, yield `Err[ValueError]` with the number `n + 1`.

    Note that the returned iterable must be consumed in order for the check to
    be made.

    Args:
        n (int): The exact number of items expected.

    Returns:
        Iter[Result[T, ValueError]]: A new Iterable wrapper yielding results based on the item count.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = ["a", "b", "c", "d"]
    >>> n = 4
    >>> pc.Iter(data).is_strictly_n(n).collect()
    Seq(Ok('a'), Ok('b'), Ok('c'), Ok('d'))
    >>> pc.Iter("ab").is_strictly_n(3).collect()  # doctest: +NORMALIZE_WHITESPACE
    Seq(Ok('a'), Ok('b'),
    Err(ValueError('Too few items in iterable (got 2)')))
    >>> pc.Iter("abc").is_strictly_n(2).collect()  # doctest: +NORMALIZE_WHITESPACE
    Seq(Ok('a'), Ok('b'),
    Err(ValueError('Too many items in iterable (got at least 3)')))

    ```
    You can easily combine this with `.map(lambda r: r.map_err(...))` to handle the errors as you wish.
    ```python
    >>> def _my_err(e: ValueError) -> str:
    ...     return f"custom error: {e}"
    >>>
    >>> pc.Iter([1]).is_strictly_n(0).map(lambda r: r.map_err(_my_err)).collect()
    Seq(Err('custom error: Too many items in iterable (got at least 1)'),)

    ```
    Or use `.filter_map(...)` to only keep the `Ok` values.
    ```python
    >>> pc.Iter([1, 2, 3]).is_strictly_n(2).filter_map(lambda r: r.ok()).collect()
    Seq(1, 2)

    ```
    """

    def _strictly_n_(data: Iterator[T]) -> Iterator[Result[T, ValueError]]:
        sent = 0
        for item in itertools.islice(data, n):
            yield Ok(item)
            sent += 1

        if sent < n:
            e = ValueError(f"Too few items in iterable (got {sent})")
            yield Err(e)

        for _ in data:
            e = ValueError(f"Too many items in iterable (got at least {n + 1})")
            yield Err(e)

    return Iter(_strictly_n_(self._inner))

map(func)

Apply a function func to each element of the Iter.

If you are good at thinking in types, you can think of Iter.map() like this:

  • You have an Iterator that gives you elements of some type A
  • You want an Iterator of some other type B
  • Then you can use .map(), passing a closure func that takes an A and returns a B.

Iter.map() is conceptually similar to a for loop.

However, as Iter.map() is lazy, it is best used when you are already working with other Iter instances.

If you are doing some sort of looping for a side effect, it is considered more idiomatic to use Iter.for_each() than Iter.map().collect().
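The equivalence with a for loop can be sketched with the built-in `map` (stdlib only, no pyochain required):

```python
nums = [1, 2, 3]

# Lazy: nothing is computed until the map object is consumed.
doubled_lazy = map(lambda x: x * 2, nums)

# The eager for-loop equivalent.
doubled_eager = []
for x in nums:
    doubled_eager.append(x * 2)

assert list(doubled_lazy) == doubled_eager == [2, 4, 6]
```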

Parameters:

Name Type Description Default
func Callable[[T], R]

Function to apply to each element.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterator of transformed elements.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2]).map(lambda x: x + 1).collect()
Seq(2, 3)
>>> # You can use methods on the class rather than on instance for convenience:
>>> pc.Iter(["a", "b", "c"]).map(str.upper).collect()
Seq('A', 'B', 'C')
>>> pc.Iter(["a", "b", "c"]).map(lambda s: s.upper()).collect()
Seq('A', 'B', 'C')

Source code in src/pyochain/_iter.py
def map[R](self, func: Callable[[T], R]) -> Iter[R]:
    """Apply a function **func** to each element of the `Iter`.

    If you are good at thinking in types, you can think of `Iter.map()` like this:

    - You have an `Iterator` that gives you elements of some type `A`
    - You want an `Iterator` of some other type `B`
    - Then you can use `.map()`, passing a closure **func** that takes an `A` and returns a `B`.

    `Iter.map()` is conceptually similar to a for loop.

    However, as `Iter.map()` is lazy, it is best used when you are already working with other `Iter` instances.

    If you are doing some sort of looping for a side effect, it is considered more idiomatic to use `Iter.for_each()` than `Iter.map().collect()`.

    Args:
        func (Callable[[T], R]): Function to apply to each element.

    Returns:
        Iter[R]: An iterator of transformed elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2]).map(lambda x: x + 1).collect()
    Seq(2, 3)
    >>> # You can use methods on the class rather than on instance for convenience:
    >>> pc.Iter(["a", "b", "c"]).map(str.upper).collect()
    Seq('A', 'B', 'C')
    >>> pc.Iter(["a", "b", "c"]).map(lambda s: s.upper()).collect()
    Seq('A', 'B', 'C')

    ```
    """
    return Iter(map(func, self._inner))

map_juxt(*funcs)

map_juxt(
    func1: Callable[[T], R1], func2: Callable[[T], R2]
) -> Iter[tuple[R1, R2]]
map_juxt(
    func1: Callable[[T], R1],
    func2: Callable[[T], R2],
    func3: Callable[[T], R3],
) -> Iter[tuple[R1, R2, R3]]
map_juxt(
    func1: Callable[[T], R1],
    func2: Callable[[T], R2],
    func3: Callable[[T], R3],
    func4: Callable[[T], R4],
) -> Iter[tuple[R1, R2, R3, R4]]

Apply several functions to each item.

Returns a new Iter where each item is a tuple of the results of applying each function to the original item.
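Conceptually, each output tuple juxtaposes the results of every function on one item; a stdlib-only sketch of that idea (the `juxt` helper below is illustrative, not the library's implementation):

```python
from collections.abc import Callable

def juxt(*funcs: Callable) -> Callable:
    """Return a function that applies every func to one argument and collects a tuple."""
    def juxtaposed(x):
        return tuple(f(x) for f in funcs)
    return juxtaposed

def is_even(n: int) -> bool:
    return n % 2 == 0

def is_positive(n: int) -> bool:
    return n > 0

results = [juxt(is_even, is_positive)(n) for n in [1, -2, 3]]
assert results == [(False, True), (True, False), (False, True)]
```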

Parameters:

Name Type Description Default
*funcs Callable[[T], object]

Functions to apply to each item.

()

Returns:

Type Description
Iter[tuple[object, ...]]

Iter[tuple[object, ...]]: An iterable of tuples containing the results of each function.

Example:

>>> import pyochain as pc
>>> def is_even(n: int) -> bool:
...     return n % 2 == 0
>>> def is_positive(n: int) -> bool:
...     return n > 0
>>>
>>> pc.Iter([1, -2, 3]).map_juxt(is_even, is_positive).collect()
Seq((False, True), (True, False), (False, True))
Source code in src/pyochain/_iter.py
def map_juxt(self, *funcs: Callable[[T], object]) -> Iter[tuple[object, ...]]:
    """Apply several functions to each item.

    Returns a new Iter where each item is a tuple of the results of applying each function to the original item.

    Args:
        *funcs (Callable[[T], object]): Functions to apply to each item.

    Returns:
        Iter[tuple[object, ...]]: An iterable of tuples containing the results of each function.
    Example:
    ```python
    >>> import pyochain as pc
    >>> def is_even(n: int) -> bool:
    ...     return n % 2 == 0
    >>> def is_positive(n: int) -> bool:
    ...     return n > 0
    >>>
    >>> pc.Iter([1, -2, 3]).map_juxt(is_even, is_positive).collect()
    Seq((False, True), (True, False), (False, True))

    ```
    """
    return Iter(map(cz.functoolz.juxt(*funcs), self._inner))

map_star(func)

map_star(func: Callable[[Any], R]) -> Iter[R]
map_star(func: Callable[[T1, T2], R]) -> Iter[R]
map_star(func: Callable[[T1, T2, T3], R]) -> Iter[R]
map_star(func: Callable[[T1, T2, T3, T4], R]) -> Iter[R]
map_star(
    func: Callable[[T1, T2, T3, T4, T5], R],
) -> Iter[R]
map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6], R],
) -> Iter[R]
map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7], R],
) -> Iter[R]
map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8], R],
) -> Iter[R]
map_star(
    func: Callable[[T1, T2, T3, T4, T5, T6, T7, T8, T9], R],
) -> Iter[R]
map_star(
    func: Callable[
        [T1, T2, T3, T4, T5, T6, T7, T8, T9, T10], R
    ],
) -> Iter[R]

Apply a function to each element, where each element is an iterable.

Unlike .map(), which passes each element as a single argument, .map_star() unpacks each element into positional arguments for the function.

In short, for each element in the Iter, it computes func(*element).

Note

Always prefer using .map_star() over .map() when working with Iter of tuple elements. Not only is it more readable, it is also much more performant (up to 30% faster in benchmarks).
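As the source below shows, `.map_star()` delegates to the standard library's `itertools.starmap`; a stdlib-only sketch of the unpacking difference:

```python
import itertools

pairs = [("blue", "S"), ("red", "M")]

# map passes each tuple as ONE argument...
with_map = list(map(lambda p: f"{p[0]}-{p[1]}", pairs))

# ...while starmap unpacks each tuple into positional arguments.
with_star = list(itertools.starmap(lambda color, size: f"{color}-{size}", pairs))

assert with_map == with_star == ["blue-S", "red-M"]
```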

Parameters:

Name Type Description Default
func Callable[..., R]

Function to apply to unpacked elements.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterable of results from applying the function to unpacked elements.

Example:

>>> import pyochain as pc
>>> def make_sku(color: str, size: str) -> str:
...     return f"{color}-{size}"
>>> data = pc.Seq(["blue", "red"])
>>> data.iter().product(["S", "M"]).map_star(make_sku).collect()
Seq('blue-S', 'blue-M', 'red-S', 'red-M')
>>> # This is equivalent to:
>>> data.iter().product(["S", "M"]).map(lambda x: make_sku(*x)).collect()
Seq('blue-S', 'blue-M', 'red-S', 'red-M')

Source code in src/pyochain/_iter.py
def map_star[U: Iterable[Any], R](
    self: Iter[U],
    func: Callable[..., R],
) -> Iter[R]:
    """Apply a function to each element, where each element is an iterable.

    Unlike `.map()`, which passes each element as a single argument, `.map_star()` unpacks each element into positional arguments for the function.

    In short, for each element in the `Iter`, it computes `func(*element)`.

    Note:
        Always prefer using `.map_star()` over `.map()` when working with `Iter` of `tuple` elements.
        Not only is it more readable, it is also much more performant (up to 30% faster in benchmarks).

    Args:
        func (Callable[..., R]): Function to apply to unpacked elements.

    Returns:
        Iter[R]: An iterable of results from applying the function to unpacked elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def make_sku(color: str, size: str) -> str:
    ...     return f"{color}-{size}"
    >>> data = pc.Seq(["blue", "red"])
    >>> data.iter().product(["S", "M"]).map_star(make_sku).collect()
    Seq('blue-S', 'blue-M', 'red-S', 'red-M')
    >>> # This is equivalent to:
    >>> data.iter().product(["S", "M"]).map(lambda x: make_sku(*x)).collect()
    Seq('blue-S', 'blue-M', 'red-S', 'red-M')

    ```
    """
    return Iter(itertools.starmap(func, self._inner))

map_while(func)

Creates an iterator that both yields elements based on a predicate and maps.

map_while() takes a closure as an argument. It will call this closure on each element of the iterator, and yield elements while it returns Some(_).

After NONE is returned, map_while() stops and the rest of the elements are ignored.

Parameters:

Name Type Description Default
func Callable[[T], Option[R]]

Function to apply to each element that returns Option[R].

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterator of transformed elements until NONE is encountered.

Example:

>>> import pyochain as pc
>>> def checked_div(x: int) -> pc.Option[int]:
...     return pc.Some(16 // x) if x != 0 else pc.NONE
>>>
>>> data = pc.Iter([-1, 4, 0, 1])
>>> data.map_while(checked_div).collect()
Seq(-16, 4)
>>> data = pc.Iter([0, 1, 2, -3, 4, 5, -6])
>>> # Convert to positive ints, stop at first negative
>>> data.map_while(lambda x: pc.Some(x) if x >= 0 else pc.NONE).collect()
Seq(0, 1, 2)

Source code in src/pyochain/_iter.py
def map_while[R](self, func: Callable[[T], Option[R]]) -> Iter[R]:
    """Creates an iterator that both yields elements based on a predicate and maps.

    `map_while()` takes a closure as an argument. It will call this closure on each element of
    the iterator, and yield elements while it returns `Some(_)`.

    After `NONE` is returned, `map_while()` stops and the rest of the elements are ignored.

    Args:
        func (Callable[[T], Option[R]]): Function to apply to each element that returns `Option[R]`.

    Returns:
        Iter[R]: An iterator of transformed elements until `NONE` is encountered.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def checked_div(x: int) -> pc.Option[int]:
    ...     return pc.Some(16 // x) if x != 0 else pc.NONE
    >>>
    >>> data = pc.Iter([-1, 4, 0, 1])
    >>> data.map_while(checked_div).collect()
    Seq(-16, 4)
    >>> data = pc.Iter([0, 1, 2, -3, 4, 5, -6])
    >>> # Convert to positive ints, stop at first negative
    >>> data.map_while(lambda x: pc.Some(x) if x >= 0 else pc.NONE).collect()
    Seq(0, 1, 2)

    ```
    """

    def _gen() -> Generator[R]:
        for opt in map(func, self._inner):
            if opt.is_none():
                return
            yield opt.unwrap()

    return Iter(_gen())

map_windows(length, func)

map_windows(
    length: Literal[1], func: Callable[[tuple[T]], R]
) -> Iter[R]
map_windows(
    length: Literal[2], func: Callable[[tuple[T, T]], R]
) -> Iter[R]
map_windows(
    length: Literal[3], func: Callable[[tuple[T, T, T]], R]
) -> Iter[R]
map_windows(
    length: Literal[4],
    func: Callable[[tuple[T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[5],
    func: Callable[[tuple[T, T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[6],
    func: Callable[[tuple[T, T, T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[7],
    func: Callable[[tuple[T, T, T, T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[8],
    func: Callable[[tuple[T, T, T, T, T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[9],
    func: Callable[[tuple[T, T, T, T, T, T, T, T, T]], R],
) -> Iter[R]
map_windows(
    length: Literal[10],
    func: Callable[
        [tuple[T, T, T, T, T, T, T, T, T, T]], R
    ],
) -> Iter[R]
map_windows(
    length: int, func: Callable[[tuple[T, ...]], R]
) -> Iter[R]

Calls the given func for each contiguous window of size length over self.

Successive windows overlap.

The provided function is called with the entire window as a single tuple argument.
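A stdlib-only sketch of the overlapping-window behavior (the `sliding_window` helper below is illustrative; per the source, the library delegates to `cz.itertoolz.sliding_window`):

```python
import statistics
from collections import deque
from collections.abc import Iterable, Iterator
from typing import TypeVar

T = TypeVar("T")

def sliding_window(length: int, data: Iterable[T]) -> Iterator[tuple[T, ...]]:
    """Yield overlapping windows of `length` consecutive elements."""
    window: deque[T] = deque(maxlen=length)
    for item in data:
        window.append(item)
        if len(window) == length:
            yield tuple(window)

# Each complete window is passed to the mapping function as one tuple.
means = [statistics.mean(w) for w in sliding_window(2, [1, 2, 3, 4])]
assert means == [1.5, 2.5, 3.5]
```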

Parameters:

Name Type Description Default
length int

The length of each window.

required
func Callable[[tuple[Any, ...]], R]

Function to apply to each window.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterator over the outputs of func.

See Also

.map_windows_star() for a version that unpacks the window into separate arguments.

Example:

>>> import pyochain as pc
>>> import statistics
>>> pc.Iter([1, 2, 3, 4]).map_windows(2, statistics.mean).collect()
Seq(1.5, 2.5, 3.5)
>>> pc.Iter("abcd").map_windows(3, lambda window: "".join(window).upper()).collect()
Seq('ABC', 'BCD')
>>> pc.Iter([10, 20, 30, 40, 50]).map_windows(4, sum).collect()
Seq(100, 140)
>>> from pathlib import Path
>>> pc.Iter(["home", "src", "pyochain"]).map_windows(2, lambda p: str(Path(*p))).collect()
Seq('home\\src', 'src\\pyochain')

Source code in src/pyochain/_iter.py
def map_windows[R](
    self, length: int, func: Callable[[tuple[Any, ...]], R]
) -> Iter[R]:
    r"""Calls the given *func* for each contiguous window of size *length* over **self**.

    Successive windows overlap.

    The provided function is called with the entire window as a single tuple argument.

    Args:
        length (int): The length of each window.
        func (Callable[[tuple[Any, ...]], R]): Function to apply to each window.

    Returns:
        Iter[R]: An iterator over the outputs of func.

    See Also:
        `.map_windows_star()` for a version that unpacks the window into separate arguments.

    Example:
    ```python
    >>> import pyochain as pc
    >>> import statistics
    >>> pc.Iter([1, 2, 3, 4]).map_windows(2, statistics.mean).collect()
    Seq(1.5, 2.5, 3.5)
    >>> pc.Iter("abcd").map_windows(3, lambda window: "".join(window).upper()).collect()
    Seq('ABC', 'BCD')
    >>> pc.Iter([10, 20, 30, 40, 50]).map_windows(4, sum).collect()
    Seq(100, 140)
    >>> from pathlib import Path
    >>> pc.Iter(["home", "src", "pyochain"]).map_windows(2, lambda p: str(Path(*p))).collect()
    Seq('home\\src', 'src\\pyochain')


    ```
    """
    return Iter(map(func, cz.itertoolz.sliding_window(length, self._inner)))

map_windows_star(length, func)

map_windows_star(
    length: Literal[1], func: Callable[[T], R]
) -> Iter[R]
map_windows_star(
    length: Literal[2], func: Callable[[T, T], R]
) -> Iter[R]
map_windows_star(
    length: Literal[3], func: Callable[[T, T, T], R]
) -> Iter[R]
map_windows_star(
    length: Literal[4], func: Callable[[T, T, T, T], R]
) -> Iter[R]
map_windows_star(
    length: Literal[5], func: Callable[[T, T, T, T, T], R]
) -> Iter[R]
map_windows_star(
    length: Literal[6],
    func: Callable[[T, T, T, T, T, T], R],
) -> Iter[R]
map_windows_star(
    length: Literal[7],
    func: Callable[[T, T, T, T, T, T, T], R],
) -> Iter[R]
map_windows_star(
    length: Literal[8],
    func: Callable[[T, T, T, T, T, T, T, T], R],
) -> Iter[R]
map_windows_star(
    length: Literal[9],
    func: Callable[[T, T, T, T, T, T, T, T, T], R],
) -> Iter[R]
map_windows_star(
    length: Literal[10],
    func: Callable[[T, T, T, T, T, T, T, T, T, T], R],
) -> Iter[R]

Calls the given func for each contiguous window of size length over self.

Successive windows overlap.

The provided function is called with each element of the window as separate arguments.

Parameters:

Name Type Description Default
length int

The length of each window.

required
func Callable[..., R]

Function to apply to each window.

required

Returns:

Type Description
Iter[R]

Iter[R]: An iterator over the outputs of func.

See Also

.map_windows() for a version that passes the entire window as a single tuple argument.

Example:

>>> import pyochain as pc
>>> pc.Iter("abcd").map_windows_star(2, lambda x, y: f"{x}+{y}").collect()
Seq('a+b', 'b+c', 'c+d')
>>> pc.Iter([1, 2, 3, 4]).map_windows_star(2, lambda x, y: x + y).collect()
Seq(3, 5, 7)

Source code in src/pyochain/_iter.py
def map_windows_star[R](self, length: int, func: Callable[..., R]) -> Iter[R]:
    """Calls the given *func* for each contiguous window of size *length* over **self**.

    Successive windows overlap.

    The provided function is called with each element of the window as separate arguments.

    Args:
        length (int): The length of each window.
        func (Callable[..., R]): Function to apply to each window.

    Returns:
        Iter[R]: An iterator over the outputs of func.

    See Also:
        `.map_windows()` for a version that passes the entire window as a single tuple argument.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter("abcd").map_windows_star(2, lambda x, y: f"{x}+{y}").collect()
    Seq('a+b', 'b+c', 'c+d')
    >>> pc.Iter([1, 2, 3, 4]).map_windows_star(2, lambda x, y: x + y).collect()
    Seq(3, 5, 7)

    ```
    """
    return Iter(
        itertools.starmap(func, cz.itertoolz.sliding_window(length, self._inner))
    )

most_common(n=None)

Return the n most common elements and their counts from the Iterator.

If n is None, then all elements are returned.
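As the source below shows, this delegates to `collections.Counter.most_common`; a stdlib sketch covering both the `n` and `n=None` cases:

```python
from collections import Counter

counts = Counter([1, 1, 2, 3, 3, 3])

# n given: the n most frequent (element, count) pairs, highest count first.
assert counts.most_common(2) == [(3, 3), (1, 2)]

# n omitted (None): all elements, ordered by decreasing count.
assert counts.most_common() == [(3, 3), (1, 2), (2, 1)]
```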

Parameters:

Name Type Description Default
n int | None

Number of most common elements to return. Defaults to None (all elements).

None

Returns:

Type Description
Vec[tuple[T, int]]

Vec[tuple[T, int]]: A Vec containing tuples of (element, count).

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 1, 2, 3, 3, 3]).most_common(2)
Vec((3, 3), (1, 2))

Source code in src/pyochain/_iter.py
def most_common(self, n: int | None = None) -> Vec[tuple[T, int]]:
    """Return the **n** most common elements and their counts from the `Iterator`.

    If **n** is `None`, then all elements are returned.

    Args:
        n (int | None): Number of most common elements to return. Defaults to None (all elements).

    Returns:
        Vec[tuple[T, int]]: A `Vec` containing tuples of (element, count).

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 1, 2, 3, 3, 3]).most_common(2)
    Vec((3, 3), (1, 2))

    ```
    """
    from collections import Counter

    return Vec.from_ref(Counter(self._inner).most_common(n))

once(value) staticmethod

Create an Iter that yields a single value.

If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like Iter([value]).

This can be considered the equivalent of .insert() but as a constructor.

Parameters:

Name Type Description Default
value V

The single value to yield.

required

Returns:

Type Description
Iter[V]

Iter[V]: An iterator yielding the specified value.

Example:

>>> import pyochain as pc
>>> pc.Iter.once(42).collect()
Seq(42,)

Source code in src/pyochain/_iter.py
@staticmethod
def once[V](value: V) -> Iter[V]:
    """Create an `Iter` that yields a single value.

    If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like `Iter([value])`.

    This can be considered the equivalent of `.insert()` but as a constructor.

    Args:
        value (V): The single value to yield.

    Returns:
        Iter[V]: An iterator yielding the specified value.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter.once(42).collect()
    Seq(42,)

    ```
    """
    return Iter((value,))

once_with(func, *args, **kwargs) staticmethod

Create an Iter that lazily generates a value exactly once by invoking the provided closure.

If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like Iter([func()]).

This can be considered the equivalent of .insert() but as a constructor.

Unlike .once(), this function will lazily generate the value on request.
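The laziness can be observed with a side effect; a stdlib-only sketch of the same idea (the `once_with` helper below is illustrative, not the library's implementation):

```python
from collections.abc import Callable, Iterator

calls: list[str] = []

def expensive() -> int:
    calls.append("called")
    return 42

def once_with(func: Callable[[], int]) -> Iterator[int]:
    # The function only runs when the iterator is first advanced.
    yield func()

it = once_with(expensive)
assert calls == []           # nothing computed yet
assert list(it) == [42]      # consumption triggers the call
assert calls == ["called"]
```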

Parameters:

Name Type Description Default
func Callable[P, R]

Function invoked lazily to produce the value.

required
*args P.args

Positional arguments to pass to func.

()
**kwargs P.kwargs

Keyword arguments to pass to func.

{}

Returns:

Type Description
Iter[R]

Iter[R]: An iterator yielding the specified value.

Example:

>>> import pyochain as pc
>>> pc.Iter.once_with(lambda: 42).collect()
Seq(42,)

Source code in src/pyochain/_iter.py
@staticmethod
def once_with[**P, R](
    func: Callable[P, R], *args: P.args, **kwargs: P.kwargs
) -> Iter[R]:
    """Create an `Iter` that lazily generates a value exactly once by invoking the provided closure.

    If you have a function which works on iterators, but you only need to process one value, you can use this method rather than doing something like `Iter([func()])`.

    This can be considered the equivalent of `.insert()` but as a constructor.

    Unlike `.once()`, this function will lazily generate the value on request.

    Args:
        func (Callable[P, R]): Function invoked lazily to produce the value.
        *args (P.args): Positional arguments to pass to **func**.
        **kwargs (P.kwargs): Keyword arguments to pass to **func**.

    Returns:
        Iter[R]: An iterator yielding the specified value.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter.once_with(lambda: 42).collect()
    Seq(42,)

    ```
    """

    def _once_with() -> Generator[R]:
        yield func(*args, **kwargs)

    return Iter(_once_with())

pairwise()

Return an iterator over pairs of consecutive elements.

Returns:

Type Description
Iter[tuple[T, T]]

Iter[tuple[T, T]]: An iterable of pairs of consecutive elements.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).pairwise().collect()
Seq((1, 2), (2, 3))

Source code in src/pyochain/_iter.py
def pairwise(self) -> Iter[tuple[T, T]]:
    """Return an iterator over pairs of consecutive elements.

    Returns:
        Iter[tuple[T, T]]: An iterable of pairs of consecutive elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).pairwise().collect()
    Seq((1, 2), (2, 3))

    ```
    """
    return Iter(itertools.pairwise(self._inner))

partition(n, pad=None)

partition(
    n: Literal[1], pad: None = None
) -> Iter[tuple[T]]
partition(
    n: Literal[2], pad: None = None
) -> Iter[tuple[T, T]]
partition(
    n: Literal[3], pad: None = None
) -> Iter[tuple[T, T, T]]
partition(
    n: Literal[4], pad: None = None
) -> Iter[tuple[T, T, T, T]]
partition(
    n: Literal[5], pad: None = None
) -> Iter[tuple[T, T, T, T, T]]
partition(n: int, pad: T) -> Iter[tuple[T, ...]]

Partition self into tuples of length n.

Parameters:

Name Type Description Default
n int

Length of each partition.

required
pad T | None

Value to pad the last partition if needed.

None

Returns:

Type Description
Iter[tuple[T, ...]]

Iter[tuple[T, ...]]: An iterable of partitioned tuples.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3, 4]).partition(2).collect()
Seq((1, 2), (3, 4))
If the length of the sequence is not evenly divisible by n, the final tuple is padded to length n with pad (None by default):
>>> pc.Iter([1, 2, 3, 4, 5]).partition(2).collect()
Seq((1, 2), (3, 4), (5, None))

Source code in src/pyochain/_iter.py
def partition(self, n: int, pad: T | None = None) -> Iter[tuple[T, ...]]:
    """Partition **self** into `tuples` of length **n**.

    Args:
        n (int): Length of each partition.
        pad (T | None): Value to pad the last partition if needed.

    Returns:
        Iter[tuple[T, ...]]: An iterable of partitioned tuples.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3, 4]).partition(2).collect()
    Seq((1, 2), (3, 4))

    ```
    If the length of the sequence is not evenly divisible by **n**, the final tuple is padded to length **n** with **pad** (`None` by default):
    ```python
    >>> pc.Iter([1, 2, 3, 4, 5]).partition(2).collect()
    Seq((1, 2), (3, 4), (5, None))

    ```
    """
    return Iter(cz.itertoolz.partition(n, self._inner, pad=pad))
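

Padded partitioning corresponds to the stdlib "grouper" recipe: n references to one shared iterator fed to `zip_longest`. A minimal sketch (`grouper` is an illustrative name, not library API):

```python
from itertools import zip_longest

def grouper(iterable, n, pad=None):
    args = [iter(iterable)] * n  # n references to one shared iterator
    return zip_longest(*args, fillvalue=pad)

assert list(grouper([1, 2, 3, 4], 2)) == [(1, 2), (3, 4)]
assert list(grouper([1, 2, 3, 4, 5], 2)) == [(1, 2), (3, 4), (5, None)]
```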

partition_all(n)

Partition all elements of sequence into tuples of length at most n.

The final tuple may be shorter to accommodate extra elements.

Parameters:

Name Type Description Default
n int

Maximum length of each partition.

required

Returns:

Type Description
Iter[tuple[T, ...]]

Iter[tuple[T, ...]]: An iterable of partitioned tuples.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3, 4]).partition_all(2).collect()
Seq((1, 2), (3, 4))
>>> pc.Iter([1, 2, 3, 4, 5]).partition_all(2).collect()
Seq((1, 2), (3, 4), (5,))

Source code in src/pyochain/_iter.py
def partition_all(self, n: int) -> Iter[tuple[T, ...]]:
    """Partition all elements of sequence into tuples of length at most n.

    The final tuple may be shorter to accommodate extra elements.

    Args:
        n (int): Maximum length of each partition.

    Returns:
        Iter[tuple[T, ...]]: An iterable of partitioned tuples.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3, 4]).partition_all(2).collect()
    Seq((1, 2), (3, 4))
    >>> pc.Iter([1, 2, 3, 4, 5]).partition_all(2).collect()
    Seq((1, 2), (3, 4), (5,))

    ```
    """
    return Iter(cz.itertoolz.partition_all(n, self._inner))
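
The "at most n, shorter final chunk" behavior can be sketched in plain Python with `islice` (the helper `chunks` is a hypothetical name, not library API):

```python
from itertools import islice

def chunks(iterable, n):
    it = iter(iterable)
    while chunk := tuple(islice(it, n)):  # empty tuple is falsy: stop
        yield chunk

assert list(chunks([1, 2, 3, 4, 5], 2)) == [(1, 2), (3, 4), (5,)]
```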

partition_by(predicate)

Partition the Iterator into a sequence of tuples according to a predicate function.

Every time the output of predicate changes, a new tuple is started, and subsequent items are collected into that tuple.

Parameters:

Name Type Description Default
predicate Callable[[T], bool]

Function to determine partition boundaries.

required

Returns:

Type Description
Iter[tuple[T, ...]]

Iter[tuple[T, ...]]: An iterable of partitioned tuples.

Example:

>>> import pyochain as pc
>>> pc.Iter("I have space").partition_by(lambda c: c == " ").collect()
Seq(('I',), (' ',), ('h', 'a', 'v', 'e'), (' ',), ('s', 'p', 'a', 'c', 'e'))
>>>
>>> data = [1, 2, 1, 99, 88, 33, 99, -1, 5]
>>> pc.Iter(data).partition_by(lambda x: x > 10).collect()
Seq((1, 2, 1), (99, 88, 33, 99), (-1, 5))

Source code in src/pyochain/_iter.py
def partition_by(self, predicate: Callable[[T], bool]) -> Iter[tuple[T, ...]]:
    """Partition the `Iterator` into a sequence of `tuples` according to a predicate function.

    Every time the output of `predicate` changes, a new `tuple` is started,
    and subsequent items are collected into that `tuple`.

    Args:
        predicate (Callable[[T], bool]): Function to determine partition boundaries.

    Returns:
        Iter[tuple[T, ...]]: An iterable of partitioned tuples.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter("I have space").partition_by(lambda c: c == " ").collect()
    Seq(('I',), (' ',), ('h', 'a', 'v', 'e'), (' ',), ('s', 'p', 'a', 'c', 'e'))
    >>>
    >>> data = [1, 2, 1, 99, 88, 33, 99, -1, 5]
    >>> pc.Iter(data).partition_by(lambda x: x > 10).collect()
    Seq((1, 2, 1), (99, 88, 33, 99), (-1, 5))

    ```
    """
    return Iter(cz.recipes.partitionby(predicate, self._inner))
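
Splitting whenever the predicate's output changes is exactly what `itertools.groupby` does when keyed on the predicate; a stdlib sketch of the second example above:

```python
from itertools import groupby

data = [1, 2, 1, 99, 88, 33, 99, -1, 5]
# a new group starts every time the key value (x > 10) flips
parts = [tuple(group) for _, group in groupby(data, key=lambda x: x > 10)]
assert parts == [(1, 2, 1), (99, 88, 33, 99), (-1, 5)]
```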

peekable(n)

Retrieve the next n elements from the Iterator, and return a Seq of the retrieved elements along with the original Iterator, unconsumed.

The returned Peekable object contains two attributes:

  • peek: A Seq of the next n elements.
  • values: An Iter that includes the peeked elements followed by the remaining elements of the original Iterator.

Peekable implements Checkable on the peek attribute.

Parameters:

Name Type Description Default
n int

Number of items to peek.

required

Returns:

Type Description
Peekable[T]

Peekable[T]: A Peekable object containing the peeked elements and the remaining iterator.

See Also

Iter.cloned() to create an independent copy of the iterator.

Example:

>>> import pyochain as pc
>>> data = pc.Iter([1, 2, 3]).peekable(2)
>>> data.peek
Seq(1, 2)
>>> data.values.collect()
Seq(1, 2, 3)

Source code in src/pyochain/_iter.py
def peekable(self, n: int) -> Peekable[T]:
    """Retrieve the next **n** elements from the `Iterator`, and return a `Seq` of the retrieved elements along with the original `Iterator`, unconsumed.

    The returned `Peekable` object contains two attributes:
    - *peek*: A `Seq` of the next **n** elements.
    - *values*: An `Iter` that includes the peeked elements followed by the remaining elements of the original `Iterator`.

    `Peekable` implements `Checkable` on the *peek* attribute.

    Args:
        n (int): Number of items to peek.

    Returns:
        Peekable[T]: A `Peekable` object containing the peeked elements and the remaining iterator.

    See Also:
        `Iter.cloned()` to create an independent copy of the iterator.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = pc.Iter([1, 2, 3]).peekable(2)
    >>> data.peek
    Seq(1, 2)
    >>> data.values.collect()
    Seq(1, 2, 3)

    ```
    """
    peeked = Seq(itertools.islice(self._inner, n))
    return Peekable(peeked, Iter(itertools.chain(peeked, self._inner)))
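
The body above reduces to a small stdlib pattern: `islice` consumes the first n elements, and `chain` reattaches them in front of the untouched remainder.

```python
from itertools import chain, islice

it = iter([1, 2, 3])
peek = list(islice(it, 2))  # consumes the first two elements of it
values = chain(peek, it)    # reattaches them ahead of the remainder
assert peek == [1, 2]
assert list(values) == [1, 2, 3]
```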

permutations(r=None)

permutations(r: Literal[2]) -> Iter[tuple[T, T]]
permutations(r: Literal[3]) -> Iter[tuple[T, T, T]]
permutations(r: Literal[4]) -> Iter[tuple[T, T, T, T]]
permutations(r: Literal[5]) -> Iter[tuple[T, T, T, T, T]]

Return all permutations of length r.

Parameters:

Name Type Description Default
r int | None

Length of each permutation. Defaults to the length of the iterable.

None

Returns:

Type Description
Iter[tuple[T, ...]]

Iter[tuple[T, ...]]: An iterable of permutations.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).permutations(2).collect()
Seq((1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2))

Source code in src/pyochain/_iter.py
def permutations(self, r: int | None = None) -> Iter[tuple[T, ...]]:
    """Return all permutations of length r.

    Args:
        r (int | None): Length of each permutation. Defaults to the length of the iterable.

    Returns:
        Iter[tuple[T, ...]]: An iterable of permutations.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).permutations(2).collect()
    Seq((1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2))

    ```
    """
    return Iter(itertools.permutations(self._inner, r))

product(*others)

product() -> Iter[tuple[T]]
product(iter1: Iterable[T1]) -> Iter[tuple[T, T1]]
product(
    iter1: Iterable[T1], iter2: Iterable[T2]
) -> Iter[tuple[T, T1, T2]]
product(
    iter1: Iterable[T1],
    iter2: Iterable[T2],
    iter3: Iterable[T3],
) -> Iter[tuple[T, T1, T2, T3]]
product(
    iter1: Iterable[T1],
    iter2: Iterable[T2],
    iter3: Iterable[T3],
    iter4: Iterable[T4],
) -> Iter[tuple[T, T1, T2, T3, T4]]

Compute the Cartesian product with one or more other iterables.

This is the declarative equivalent of nested for-loops.

It pairs every element from the source iterable with every combination of elements from the other iterables.

Parameters:

Name Type Description Default
*others Iterable[Any]

Other iterables to compute the Cartesian product with.

()

Returns:

Type Description
Iter[tuple[Any, ...]]

Iter[tuple[Any, ...]]: An iterable of tuples containing elements from the Cartesian product.

Example:

>>> import pyochain as pc
>>> pc.Iter(["blue", "red"]).product(["S", "M"]).collect()
Seq(('blue', 'S'), ('blue', 'M'), ('red', 'S'), ('red', 'M'))
>>> res = (
...     pc.Iter(["blue", "red"])
...     .product(["S", "M"])
...     .map_star(lambda color, size: f"{color}-{size}")
...     .collect()
... )
>>> res
Seq('blue-S', 'blue-M', 'red-S', 'red-M')
>>> res = (
...     pc.Iter([1, 2, 3])
...     .product([10, 20])
...     .filter_star(lambda a, b: a * b >= 40)
...     .map_star(lambda a, b: a * b)
...     .collect()
... )
>>> res
Seq(40, 60)
>>> res = (
...     pc.Iter([1])
...     .product(["a", "b"], [True])
...     .filter_star(lambda _a, b, _c: b != "a")
...     .map_star(lambda a, b, c: f"{a}{b} is {c}")
...     .collect()
... )
>>> res
Seq('1b is True',)

Source code in src/pyochain/_iter.py
def product(self, *others: Iterable[Any]) -> Iter[tuple[Any, ...]]:
    """Compute the Cartesian product with one or more other iterables.

    This is the declarative equivalent of nested for-loops.

    It pairs every element from the source iterable with every combination of
    elements from the other iterables.

    Args:
        *others (Iterable[Any]): Other iterables to compute the Cartesian product with.

    Returns:
        Iter[tuple[Any, ...]]: An iterable of tuples containing elements from the Cartesian product.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter(["blue", "red"]).product(["S", "M"]).collect()
    Seq(('blue', 'S'), ('blue', 'M'), ('red', 'S'), ('red', 'M'))
    >>> res = (
    ...     pc.Iter(["blue", "red"])
    ...     .product(["S", "M"])
    ...     .map_star(lambda color, size: f"{color}-{size}")
    ...     .collect()
    ... )
    >>> res
    Seq('blue-S', 'blue-M', 'red-S', 'red-M')
    >>> res = (
    ...     pc.Iter([1, 2, 3])
    ...     .product([10, 20])
    ...     .filter_star(lambda a, b: a * b >= 40)
    ...     .map_star(lambda a, b: a * b)
    ...     .collect()
    ... )
    >>> res
    Seq(40, 60)
    >>> res = (
    ...     pc.Iter([1])
    ...     .product(["a", "b"], [True])
    ...     .filter_star(lambda _a, b, _c: b != "a")
    ...     .map_star(lambda a, b, c: f"{a}{b} is {c}")
    ...     .collect()
    ... )
    >>> res
    Seq('1b is True',)

    ```
    """
    return Iter(itertools.product(self._inner, *others))
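
The "nested for-loops" equivalence is easy to check directly against `itertools.product`:

```python
from itertools import product

colors, sizes = ("blue", "red"), ("S", "M")
nested = [(c, s) for c in colors for s in sizes]  # the imperative form
assert list(product(colors, sizes)) == nested
```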

repeat(n=None)

Repeat the entire Iter n times (as elements).

If n is None, repeat indefinitely.

Operates lazily, hence if you need to get the underlying elements, you will need to collect each repeated Iter via .map(lambda x: x.collect()) or similar.

Warning

If n is None, this will create an infinite Iterator.

Be sure to use Iter.take() or Iter.slice() to limit the number of items taken.

See Also

Iter.cycle() to repeat the elements of the Iter indefinitely.

Parameters:

Name Type Description Default
n int | None

Optional number of repetitions.

None

Returns:

Type Description
Iter[Self]

Iter[Self]: An Iter of repeated Iter.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2]).repeat(3).map(list).collect()
Seq([1, 2], [1, 2], [1, 2])

Source code in src/pyochain/_iter.py
def repeat(self, n: int | None = None) -> Iter[Self]:
    """Repeat the entire `Iter` **n** times (as elements).

    If **n** is `None`, repeat indefinitely.

    Operates lazily, hence if you need to get the underlying elements, you will need to collect each repeated `Iter` via `.map(lambda x: x.collect())` or similar.

    Warning:
        If **n** is `None`, this will create an infinite `Iterator`.

        Be sure to use `Iter.take()` or `Iter.slice()` to limit the number of items taken.

    See Also:
        `Iter.cycle()` to repeat the *elements* of the `Iter` indefinitely.

    Args:
        n (int | None): Optional number of repetitions.

    Returns:
        Iter[Self]: An `Iter` of repeated `Iter`.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2]).repeat(3).map(list).collect()
    Seq([1, 2], [1, 2], [1, 2])

    ```
    """
    new = self.__class__

    def _repeat_infinite() -> Generator[Self]:
        tee = functools.partial(itertools.tee, self._inner, 1)
        iterators = tee()
        while True:
            yield new(iterators[0])
            iterators = tee()

    if n is None:
        return Iter(_repeat_infinite())
    return Iter(map(new, itertools.tee(self._inner, n)))

scan(initial, func)

Transform elements by sharing state between iterations.

scan takes two arguments:

  • an initial value which seeds the internal state
  • a func with two arguments

The first being a reference to the internal state and the second an iterator element.

The func can assign to the internal state to share state between iterations.

On iteration, the func will be applied to each element of the iterator, and its return value, an Option, determines the next yielded value.

Thus the func can return Some(value) to yield value, or NONE to end the iteration.

Parameters:

Name Type Description Default
initial U

Initial state.

required
func Callable[[U, T], Option[U]]

Function that takes the current state and an item, and returns an Option.

required

Returns:

Type Description
Iter[U]

Iter[U]: An iterable of the yielded values.

Example:

>>> import pyochain as pc
>>> def accumulate_until_limit(state: int, item: int) -> pc.Option[int]:
...     new_state = state + item
...     match new_state:
...         case _ if new_state <= 10:
...             return pc.Some(new_state)
...         case _:
...             return pc.NONE
>>> pc.Iter([1, 2, 3, 4, 5]).scan(0, accumulate_until_limit).collect()
Seq(1, 3, 6, 10)

Source code in src/pyochain/_iter.py
def scan[U](self, initial: U, func: Callable[[U, T], Option[U]]) -> Iter[U]:
    """Transform elements by sharing state between iterations.

    `scan` takes two arguments:
        - an **initial** value which seeds the internal state
        - a **func** with two arguments

    The first being a reference to the internal state and the second an iterator element.

    The **func** can assign to the internal state to share state between iterations.

    On iteration, the **func** will be applied to each element of the iterator, and its return value, an `Option`, determines the next yielded value.

    Thus the **func** can return `Some(value)` to yield value, or `NONE` to end the iteration.

    Args:
        initial (U): Initial state.
        func (Callable[[U, T], Option[U]]): Function that takes the current state and an item, and returns an Option.

    Returns:
        Iter[U]: An iterable of the yielded values.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def accumulate_until_limit(state: int, item: int) -> pc.Option[int]:
    ...     new_state = state + item
    ...     match new_state:
    ...         case _ if new_state <= 10:
    ...             return pc.Some(new_state)
    ...         case _:
    ...             return pc.NONE
    >>> pc.Iter([1, 2, 3, 4, 5]).scan(0, accumulate_until_limit).collect()
    Seq(1, 3, 6, 10)

    ```
    """

    def _gen(data: Iterable[T]) -> Iterator[U]:
        current: U = initial
        for item in data:
            res = func(current, item)
            if res.is_none():
                break
            current = res.unwrap()
            yield res.unwrap()

    return Iter(_gen(self._inner))
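
The stateful loop above can be sketched in plain Python, with `None` standing in for `NONE` (names `scan` and `until_ten` are illustrative, not library API):

```python
def scan(iterable, initial, func):
    state = initial
    for item in iterable:
        state = func(state, item)
        if state is None:  # None plays the role of NONE in this sketch
            return
        yield state

def until_ten(state, item):
    total = state + item
    return total if total <= 10 else None

assert list(scan([1, 2, 3, 4, 5], 0, until_ten)) == [1, 3, 6, 10]
```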

sort(*, key=None, reverse=False)

sort(*, key: None = None, reverse: bool = False) -> Vec[U]
sort(
    *,
    key: Callable[[T], SupportsRichComparison[Any]],
    reverse: bool = False,
) -> Vec[T]
sort(*, key: None = None, reverse: bool = False) -> Never

Sort the elements of the sequence.

If a key function is provided, it is used to extract a comparison key from each element.

Note

This method must consume the entire Iter to perform the sort. The result is a new Vec over the sorted sequence.

Parameters:

Name Type Description Default
key Callable[[T], SupportsRichComparison[Any]] | None

Function to extract a comparison key from each element.

None
reverse bool

Whether to sort in descending order.

False

Returns:

Type Description
Vec[Any]

Vec[Any]: A Vec with elements sorted.

Example:

>>> import pyochain as pc
>>> pc.Iter([3, 1, 2]).sort()
Vec(1, 2, 3)

Source code in src/pyochain/_iter.py
def sort(
    self,
    *,
    key: Callable[[T], SupportsRichComparison[Any]] | None = None,
    reverse: bool = False,
) -> Vec[Any]:
    """Sort the elements of the sequence.

    If a key function is provided, it is used to extract a comparison key from each element.

    Note:
        This method must consume the entire `Iter` to perform the sort.
        The result is a new `Vec` over the sorted sequence.

    Args:
        key (Callable[[T], SupportsRichComparison[Any]] | None): Function to extract a comparison key from each element.
        reverse (bool): Whether to sort in descending order.

    Returns:
        Vec[Any]: A `Vec` with elements sorted.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([3, 1, 2]).sort()
    Vec(1, 2, 3)

    ```
    """
    return Vec.from_ref(sorted(self._inner, reverse=reverse, key=key))
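
Since the method defers to the builtin `sorted`, the `key` and `reverse` parameters behave exactly as in plain Python:

```python
nums = [3, 1, 2]
assert sorted(nums) == [1, 2, 3]
assert sorted(nums, reverse=True) == [3, 2, 1]
# a key function extracts the comparison value from each element
assert sorted(["bbb", "a", "cc"], key=len) == ["a", "cc", "bbb"]
```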

split_after(predicate, max_split=-1)

Yield iterators of items from the iterable, where each iterator ends with an item where predicate returns True.

By default, no limit is placed on the number of splits.

Parameters:

Name Type Description Default
predicate Callable[[T], bool]

Function to determine the split points.

required
max_split int

Maximum number of splits to perform.

-1

Returns:

Type Description
Iter[Self]

Iter[Self]: An iterator of sub-iterators of items.

Example:

>>> import pyochain as pc
>>> pc.Iter("one1two2").split_after(str.isdigit).map(list).collect()
Seq(['o', 'n', 'e', '1'], ['t', 'w', 'o', '2'])

>>> def cond(n: int) -> bool:
...     return n % 3 == 0
>>>
>>> pc.Iter(range(10)).split_after(cond).map(list).collect()
Seq([0], [1, 2, 3], [4, 5, 6], [7, 8, 9])
>>> pc.Iter(range(10)).split_after(cond, max_split=2).map(list).collect()
Seq([0], [1, 2, 3], [4, 5, 6, 7, 8, 9])

Source code in src/pyochain/_iter.py
def split_after(
    self,
    predicate: Callable[[T], bool],
    max_split: int = -1,
) -> Iter[Self]:
    """Yield iterators of items from the iterable, where each iterator ends with an item where `predicate` returns True.

    By default, no limit is placed on the number of splits.

    Args:
        predicate (Callable[[T], bool]): Function to determine the split points.
        max_split (int): Maximum number of splits to perform.

    Returns:
        Iter[Self]: An iterator of sub-iterators of items.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter("one1two2").split_after(str.isdigit).map(list).collect()
    Seq(['o', 'n', 'e', '1'], ['t', 'w', 'o', '2'])

    >>> def cond(n: int) -> bool:
    ...     return n % 3 == 0
    >>>
    >>> pc.Iter(range(10)).split_after(cond).map(list).collect()
    Seq([0], [1, 2, 3], [4, 5, 6], [7, 8, 9])
    >>> pc.Iter(range(10)).split_after(cond, max_split=2).map(list).collect()
    Seq([0], [1, 2, 3], [4, 5, 6, 7, 8, 9])

    ```
    """

    def _split_after(data: Iterator[T], max_split: int) -> Iterator[Self]:
        """Credits: more_itertools.split_after."""
        new = self.__class__
        if max_split == 0:
            yield new(data)
            return

        buf: list[T] = []
        for item in data:
            buf.append(item)
            if predicate(item) and buf:
                yield new(buf)
                if max_split == 1:
                    buf = list(data)
                    if buf:
                        yield new(buf)
                    return
                buf = []
                max_split -= 1
        if buf:
            yield new(buf)

    return Iter(_split_after(self._inner, max_split))
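
Without the `max_split` bookkeeping, the buffered split reduces to a short generator (`split_after` below is an illustrative stdlib-only sketch, not the library function):

```python
def split_after(iterable, predicate):
    buf = []
    for item in iterable:
        buf.append(item)
        if predicate(item):  # the matching item closes the current group
            yield buf
            buf = []
    if buf:                  # trailing items with no separator after them
        yield buf

assert list(split_after("one1two2", str.isdigit)) == [
    ["o", "n", "e", "1"],
    ["t", "w", "o", "2"],
]
```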

split_at(predicate, max_split=-1, *, keep_separator=False)

Yield iterators of items from iterable, where each iterator is delimited by an item where predicate returns True.

By default, no limit is placed on the number of splits.

Parameters:

Name Type Description Default
predicate Callable[[T], bool]

Function to determine the split points.

required
max_split int

Maximum number of splits to perform.

-1
keep_separator bool

Whether to include the separator in the output.

False

Returns:

Type Description
Iter[Self]

Iter[Self]: An iterator of iterators, each containing a segment of the original iterable.

By default, the delimiting items are not included in the output.

To include them, set keep_separator to True. At most max_split splits are done.

If max_split is not specified or -1, then there is no limit on the number of splits.

Example:

>>> import pyochain as pc
>>> def _to_res(x: pc.Iter[pc.Iter[str]]) -> pc.Seq[pc.Seq[str]]:
...     return x.map(lambda x: x.into(list)).collect()
>>>
>>> pc.Iter("abcdcba").split_at(lambda x: x == "b").into(_to_res)
Seq(['a'], ['c', 'd', 'c'], ['a'])
>>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1).into(_to_res)
Seq([0], [2], [4], [6], [8], [])
>>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1, max_split=2).into(_to_res)
Seq([0], [2], [4, 5, 6, 7, 8, 9])
>>>
>>> def cond(x: str) -> bool:
...     return x == "b"
>>>
>>> pc.Iter("abcdcba").split_at(cond, keep_separator=True).into(_to_res)
Seq(['a'], ['b'], ['c', 'd', 'c'], ['b'], ['a'])

Source code in src/pyochain/_iter.py
def split_at(
    self,
    predicate: Callable[[T], bool],
    max_split: int = -1,
    *,
    keep_separator: bool = False,
) -> Iter[Self]:
    """Yield iterators of items from iterable, where each iterator is delimited by an item where `predicate` returns True.

    By default, no limit is placed on the number of splits.

    Args:
        predicate (Callable[[T], bool]): Function to determine the split points.
        max_split (int): Maximum number of splits to perform.
        keep_separator (bool): Whether to include the separator in the output.

    Returns:
        Iter[Self]: An iterator of iterators, each containing a segment of the original iterable.

    By default, the delimiting items are not included in the output.

    To include them, set *keep_separator* to `True`.
    At most *max_split* splits are done.

    If *max_split* is not specified or -1, then there is no limit on the number of splits.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def _to_res(x: pc.Iter[pc.Iter[str]]) -> pc.Seq[pc.Seq[str]]:
    ...     return x.map(lambda x: x.into(list)).collect()
    >>>
    >>> pc.Iter("abcdcba").split_at(lambda x: x == "b").into(_to_res)
    Seq(['a'], ['c', 'd', 'c'], ['a'])
    >>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1).into(_to_res)
    Seq([0], [2], [4], [6], [8], [])
    >>> pc.Iter(range(10)).split_at(lambda n: n % 2 == 1, max_split=2).into(_to_res)
    Seq([0], [2], [4, 5, 6, 7, 8, 9])
    >>>
    >>> def cond(x: str) -> bool:
    ...     return x == "b"
    >>>
    >>> pc.Iter("abcdcba").split_at(cond, keep_separator=True).into(_to_res)
    Seq(['a'], ['b'], ['c', 'd', 'c'], ['b'], ['a'])

    ```
    """

    def _split_at(data: Iterator[T], max_split: int) -> Iterator[Self]:
        """Credits: more_itertools.split_at."""
        new = self.__class__
        if max_split == 0:
            yield self
            return

        buf: list[T] = []
        for item in data:
            if predicate(item):
                yield new(buf)
                if keep_separator:
                    yield new((item,))
                if max_split == 1:
                    yield new(data)
                    return
                buf = []
                max_split -= 1
            else:
                buf.append(item)
        yield new(buf)

    return Iter(_split_at(self._inner, max_split))

split_before(predicate, max_split=-1)

Yield iterators of items from the iterable, where each iterator begins with an item where predicate returns True.

By default, no limit is placed on the number of splits.

Parameters:

Name Type Description Default
predicate Callable[[T], bool]

Function to determine the split points.

required
max_split int

Maximum number of splits to perform.

-1

Returns:

Type Description
Iter[Self]

Iter[Self]: An iterator of sub-iterators of items.

At most max_split splits are done.

If max_split is not specified or -1, then there is no limit on the number of splits:

Example:

>>> import pyochain as pc
>>> pc.Iter("abcdcba").split_before(lambda x: x == "b").map(list).collect()
Seq(['a'], ['b', 'c', 'd', 'c'], ['b', 'a'])
>>>
>>> def cond(n: int) -> bool:
...     return n % 2 == 1
>>>
>>> pc.Iter(range(10)).split_before(cond).map(list).collect()
Seq([0], [1, 2], [3, 4], [5, 6], [7, 8], [9])
>>> pc.Iter(range(10)).split_before(cond, max_split=2).map(list).collect()
Seq([0], [1, 2], [3, 4, 5, 6, 7, 8, 9])

Source code in src/pyochain/_iter.py
def split_before(
    self,
    predicate: Callable[[T], bool],
    max_split: int = -1,
) -> Iter[Self]:
    """Yield iterators of items from the iterable, where each iterator begins with an item where `predicate` returns True.

    By default, no limit is placed on the number of splits.

    Args:
        predicate (Callable[[T], bool]): Function to determine the split points.
        max_split (int): Maximum number of splits to perform.

    Returns:
        Iter[Self]: An iterator of sub-iterators of items.


    At most *max_split* splits are done.


    If *max_split* is not specified or -1, then there is no limit on the number of splits:

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter("abcdcba").split_before(lambda x: x == "b").map(list).collect()
    Seq(['a'], ['b', 'c', 'd', 'c'], ['b', 'a'])
    >>>
    >>> def cond(n: int) -> bool:
    ...     return n % 2 == 1
    >>>
    >>> pc.Iter(range(10)).split_before(cond).map(list).collect()
    Seq([0], [1, 2], [3, 4], [5, 6], [7, 8], [9])
    >>> pc.Iter(range(10)).split_before(cond, max_split=2).map(list).collect()
    Seq([0], [1, 2], [3, 4, 5, 6, 7, 8, 9])

    ```
    """

    def _split_before(data: Iterator[T], max_split: int) -> Iterator[Self]:
        """Credits: more_itertools.split_before."""
        new = self.__class__

        if max_split == 0:
            yield new(data)
            return

        buf: list[T] = []
        for item in data:
            if predicate(item) and buf:
                yield new(buf)
                if max_split == 1:
                    yield new([item, *data])
                    return
                buf = []
                max_split -= 1
            buf.append(item)
        if buf:
            yield new(buf)

    return Iter(_split_before(self._inner, max_split))

split_into(*sizes)

Yield chunks of sequential items from the iterable, one of length n for each Some(n) in sizes.

Parameters:

Name Type Description Default
*sizes Option[int]

Some integers specifying the sizes of each chunk. Use NONE for the remainder.

()

Returns:

Type Description
Iter[Self]

Iter[Self]: An iterator of iterators, each containing a chunk of the original iterable.

If the sum of sizes is smaller than the length of iterable, then the remaining items of iterable will not be returned.

If the sum of sizes is larger than the length of iterable:

  • fewer items will be returned in the iteration that overruns the iterable
  • further lists will be empty

When a NONE object is encountered in sizes, the returned list will contain items up to the end of iterable the same way that itertools.islice does.

split_into can be useful for grouping a series of items where the sizes of the groups are not uniform.

An example would be where in a row from a table:

  • multiple columns represent elements of the same feature (e.g. a point represented by x,y,z)
  • the format is not the same for all columns.

Example:

>>> import pyochain as pc
>>> def _get_results(x: pc.Iter[pc.Iter[int]]) -> pc.Seq[pc.Seq[int]]:
...    return x.map(lambda x: x.collect()).collect()
>>>
>>> data = [1, 2, 3, 4, 5, 6]
>>> pc.Iter(data).split_into(pc.Some(1), pc.Some(2), pc.Some(3)).into(_get_results)
Seq(Seq(1,), Seq(2, 3), Seq(4, 5, 6))
>>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3)).into(_get_results)
Seq(Seq(1, 2), Seq(3, 4, 5))
>>> pc.Iter([1, 2, 3, 4]).split_into(pc.Some(1), pc.Some(2), pc.Some(3), pc.Some(4)).into(_get_results)
Seq(Seq(1,), Seq(2, 3), Seq(4,), Seq())
>>> data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0]
>>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3), pc.NONE).into(_get_results)
Seq(Seq(1, 2), Seq(3, 4, 5), Seq(6, 7, 8, 9, 0))

Source code in src/pyochain/_iter.py
def split_into(self, *sizes: Option[int]) -> Iter[Self]:
    """Yield a list of sequential items from iterable of length 'n' for each integer 'n' in sizes.

    Args:
        *sizes (Option[int]): `Some` integers specifying the sizes of each chunk. Use `NONE` for the remainder.

    Returns:
        Iter[Self]: An iterator of iterators, each containing a chunk of the original iterable.

    If the sum of sizes is smaller than the length of iterable, then the remaining items of iterable will not be returned.

    If the sum of sizes is larger than the length of iterable:

    - fewer items will be returned in the iteration that overruns the iterable
    - further lists will be empty

    When a `NONE` object is encountered in sizes, the returned chunk will contain items up to the end of iterable, the same way that itertools.islice does.

    split_into can be useful for grouping a series of items where the sizes of the groups are not uniform.

    An example would be where in a row from a table:

    - multiple columns represent elements of the same feature (e.g. a point represented by x,y,z)
    - the format is not the same for all columns.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def _get_results(x: pc.Iter[pc.Iter[int]]) -> pc.Seq[pc.Seq[int]]:
    ...    return x.map(lambda x: x.collect()).collect()
    >>>
    >>> data = [1, 2, 3, 4, 5, 6]
    >>> pc.Iter(data).split_into(pc.Some(1), pc.Some(2), pc.Some(3)).into(_get_results)
    Seq(Seq(1,), Seq(2, 3), Seq(4, 5, 6))
    >>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3)).into(_get_results)
    Seq(Seq(1, 2), Seq(3, 4, 5))
    >>> pc.Iter([1, 2, 3, 4]).split_into(pc.Some(1), pc.Some(2), pc.Some(3), pc.Some(4)).into(_get_results)
    Seq(Seq(1,), Seq(2, 3), Seq(4,), Seq())
    >>> data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0]
    >>> pc.Iter(data).split_into(pc.Some(2), pc.Some(3), pc.NONE).into(_get_results)
    Seq(Seq(1, 2), Seq(3, 4, 5), Seq(6, 7, 8, 9, 0))

    ```
    """

    def _split_into(data: Iterator[T]) -> Iterator[Self]:
        """Credits: more_itertools.split_into."""
        new = self.__class__
        for size in sizes:
            if size.is_none():
                yield new(data)
                return
            else:
                yield new(itertools.islice(data, size.unwrap()))

    return Iter(_split_into(self._inner))

split_when(predicate, max_split=-1)

Split iterable into pieces based on the output of a predicate function.

By default, no limit is placed on the number of splits.

Parameters:

Name Type Description Default
predicate Callable[[T, T], bool]

Function that takes successive pairs of items and returns True if the iterable should be split.

required
max_split int

Maximum number of splits to perform.

-1

Returns:

Type Description
Iter[Self]

Iter[Self]: An iterator of iterators of items.

At most max_split splits are done.

If max_split is not specified or -1, then there is no limit on the number of splits.

The example below shows how to find runs of increasing numbers, by splitting the iterable when element i is larger than element i + 1.

Example:

>>> import pyochain as pc
>>> data = pc.Seq([1, 2, 3, 3, 2, 5, 2, 4, 2])
>>> data.iter().split_when(lambda x, y: x > y).map(lambda x: x.collect()).collect()
Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4), Seq(2,))
>>> data.iter().split_when(lambda x, y: x > y, max_split=2).map(lambda x: x.collect()).collect()
Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4, 2))
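The splitting logic (without the max_split handling) can be sketched in plain Python; this is a minimal illustration, not the library implementation:

```python
def split_when(iterable, predicate):
    # Start a new chunk whenever predicate(previous, current) is True.
    it = iter(iterable)
    try:
        cur = next(it)
    except StopIteration:
        return
    buf = [cur]
    for nxt in it:
        if predicate(cur, nxt):
            yield buf
            buf = []
        buf.append(nxt)
        cur = nxt
    yield buf

print(list(split_when([1, 2, 3, 3, 2, 5, 2, 4, 2], lambda x, y: x > y)))
# [[1, 2, 3, 3], [2, 5], [2, 4], [2]]
```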

Source code in src/pyochain/_iter.py
def split_when(
    self,
    predicate: Callable[[T, T], bool],
    max_split: int = -1,
) -> Iter[Self]:
    """Split iterable into pieces based on the output of a predicate function.

    By default, no limit is placed on the number of splits.

    Args:
        predicate (Callable[[T, T], bool]): Function that takes successive pairs of items and returns True if the iterable should be split.
        max_split (int): Maximum number of splits to perform.

    Returns:
        Iter[Self]: An iterator of iterators of items.

    At most *max_split* splits are done.

    If *max_split* is not specified or -1, then there is no limit on the number of splits.

    The example below shows how to find runs of increasing numbers, by splitting the iterable when element i is larger than element i + 1.

    Example:
    ```python
    >>> import pyochain as pc
    >>> data = pc.Seq([1, 2, 3, 3, 2, 5, 2, 4, 2])
    >>> data.iter().split_when(lambda x, y: x > y).map(lambda x: x.collect()).collect()
    Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4), Seq(2,))
    >>> data.iter().split_when(lambda x, y: x > y, max_split=2).map(lambda x: x.collect()).collect()
    Seq(Seq(1, 2, 3, 3), Seq(2, 5), Seq(2, 4, 2))

    ```
    """

    def _split_when(data: Iterator[T], max_split: int) -> Iterator[Self]:
        """Credits: more_itertools.split_when."""
        new = self.__class__
        if max_split == 0:
            yield self
            return
        try:
            cur_item = next(data)
        except StopIteration:
            return

        buf = [cur_item]
        for next_item in data:
            if predicate(cur_item, next_item):
                yield new(buf)
                if max_split == 1:
                    yield new((next_item, *data))
                    return
                buf = []
                max_split -= 1

            buf.append(next_item)
            cur_item = next_item

        yield new(buf)

    return Iter(_split_when(self._inner, max_split))

successors(first, succ) staticmethod

Create an iterator of successive values computed from the previous one.

The iterator yields first (if it is Some), then repeatedly applies succ to the previous yielded value until it returns NONE.

Parameters:

Name Type Description Default
first Option[U]

Initial item.

required
succ Callable[[U], Option[U]]

Successor function.

required

Returns:

Type Description
Iter[U]

Iter[U]: Iterator yielding first and its successors.

Example:

>>> import pyochain as pc
>>> def next_pow10(x: int) -> pc.Option[int]:
...     return pc.Some(x * 10) if x < 10_000 else pc.NONE
>>> pc.Iter.successors(pc.Some(1), next_pow10).collect()
Seq(1, 10, 100, 1000, 10000)
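The same generator logic can be written in plain Python, with `None` standing in for `NONE` (a minimal sketch of the idea, not the library implementation):

```python
def successors(first, succ):
    # Yield `first`, then keep applying `succ` to the last yielded
    # value until it returns None.
    current = first
    while current is not None:
        yield current
        current = succ(current)

print(list(successors(1, lambda x: x * 10 if x < 10_000 else None)))
# [1, 10, 100, 1000, 10000]
```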

Source code in src/pyochain/_iter.py
@staticmethod
def successors[U](first: Option[U], succ: Callable[[U], Option[U]]) -> Iter[U]:
    """Create an iterator of successive values computed from the previous one.

    The iterator yields `first` (if it is `Some`), then repeatedly applies **succ** to the
    previous yielded value until it returns `NONE`.

    Args:
        first (Option[U]): Initial item.
        succ (Callable[[U], Option[U]]): Successor function.

    Returns:
        Iter[U]: Iterator yielding `first` and its successors.

    Example:
    ```python
    >>> import pyochain as pc
    >>> def next_pow10(x: int) -> pc.Option[int]:
    ...     return pc.Some(x * 10) if x < 10_000 else pc.NONE
    >>> pc.Iter.successors(pc.Some(1), next_pow10).collect()
    Seq(1, 10, 100, 1000, 10000)

    ```
    """

    def _successors() -> Iterator[U]:
        current = first
        while current.is_some():
            value = current.unwrap()
            yield value
            current = succ(value)

    return Iter(_successors())

tail(n)

Return a Seq of the last n elements of the Iterator.

Parameters:

Name Type Description Default
n int

Number of elements to return.

required

Returns:

Type Description
Seq[T]

Seq[T]: A Seq containing the last n elements.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2, 3]).tail(2)
Seq(2, 3)

Source code in src/pyochain/_iter.py
def tail(self, n: int) -> Seq[T]:
    """Return a `Seq` of the last **n** elements of the `Iterator`.

    Args:
        n (int): Number of elements to return.

    Returns:
        Seq[T]: A `Seq` containing the last **n** elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2, 3]).tail(2)
    Seq(2, 3)

    ```
    """
    return Seq(cz.itertoolz.tail(n, self._inner))

top_n(n, key=None)

Return a Seq of the top-n items according to key.

Parameters:

Name Type Description Default
n int

Number of top elements to return.

required
key Callable[[T], Any] | None

Function to extract a comparison key from each element.

None

Returns:

Type Description
Seq[T]

Seq[T]: A new Seq containing the top-n elements.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 3, 2]).top_n(2)
Seq(3, 2)
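The example above omits key; since this method delegates to cz.itertoolz.topk, its behavior is analogous to stdlib heapq.nlargest. A small stdlib sketch with hypothetical data showing key in action:

```python
import heapq

words = ["fig", "banana", "apple", "kiwi"]
# Top 2 by length, analogous to Iter(words).top_n(2, key=len)
print(heapq.nlargest(2, words, key=len))
# ['banana', 'apple']
```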

Source code in src/pyochain/_iter.py
def top_n(self, n: int, key: Callable[[T], Any] | None = None) -> Seq[T]:
    """Return a tuple of the top-n items according to key.

    Args:
        n (int): Number of top elements to return.
        key (Callable[[T], Any] | None): Function to extract a comparison key from each element.

    Returns:
        Seq[T]: A new Seq containing the top-n elements.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 3, 2]).top_n(2)
    Seq(3, 2)

    ```
    """
    return Seq(cz.itertoolz.topk(n, self._inner, key=key))

try_collect()

Fallibly transforms self into a Vec, short-circuiting if a failure is encountered.

try_collect() is a variation of collect() that allows fallible conversions during collection.

Its main use case is simplifying conversions from iterators yielding Option[T] or Result[T, E] into Option[Vec[T]].

Also, if a failure is encountered during try_collect(), the Iter is still valid and may continue to be used, in which case it will continue iterating starting after the element that triggered the failure.

See the last example below for a demonstration of how this works.

Note

This method returns Vec[U] instead of being customizable, because the underlying data structure must be mutable in order to build up the collection.

Returns:

Type Description
Option[Vec[U]]

Option[Vec[U]]: Some[Vec[U]] if all elements were successfully collected, or NONE if a failure was encountered.

Example:

>>> import pyochain as pc
>>> # Successfully collecting an iterator of Option[int] into Option[Vec[int]]:
>>> pc.Iter([pc.Some(1), pc.Some(2), pc.Some(3)]).try_collect()
Some(Vec(1, 2, 3))
>>> # Failing to collect in the same way:
>>> pc.Iter([pc.Some(1), pc.Some(2), pc.NONE, pc.Some(3)]).try_collect()
NONE
>>> # A similar example, but with Result:
>>> pc.Iter([pc.Ok(1), pc.Ok(2), pc.Ok(3)]).try_collect()
Some(Vec(1, 2, 3))
>>> pc.Iter([pc.Ok(1), pc.Err("error"), pc.Ok(3)]).try_collect()
NONE
>>> def external_fn(x: int) -> pc.Option[int]:
...     if x % 2 == 0:
...         return pc.Some(x)
...     return pc.NONE
>>> pc.Iter([1, 2, 3, 4]).map(external_fn).try_collect()
NONE
>>> # Demonstrating that the iterator remains usable after a failure:
>>> it = pc.Iter([pc.Some(1), pc.NONE, pc.Some(3), pc.Some(4)])
>>> it.try_collect()
NONE
>>> it.try_collect()
Some(Vec(3, 4))
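The short-circuit and leftover-iterator behavior can be sketched in plain Python, with `None` as the failure value (a minimal sketch, not the library implementation):

```python
def try_collect(it):
    # Stop at the first None; the failing element is consumed,
    # so a later call resumes just after it.
    out = []
    for item in it:
        if item is None:
            return None
        out.append(item)
    return out

it = iter([1, None, 3, 4])
print(try_collect(it))  # None
print(try_collect(it))  # [3, 4]
```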

Source code in src/pyochain/_iter.py
def try_collect[U](self: Iter[Option[U]] | Iter[Result[U, Any]]) -> Option[Vec[U]]:
    """Fallibly transforms **self** into a `Vec`, short circuiting if a failure is encountered.

    `try_collect()` is a variation of `collect()` that allows fallible conversions during collection.

    Its main use case is simplifying conversions from iterators yielding `Option[T]` or `Result[T, E]` into `Option[Vec[T]]`.

    Also, if a failure is encountered during `try_collect()`, the `Iter` is still valid and may continue to be used, in which case it will continue iterating starting after the element that triggered the failure.

    See the last example below for a demonstration of how this works.

    Note:
        This method returns `Vec[U]` instead of being customizable, because the underlying data structure must be mutable in order to build up the collection.

    Returns:
        Option[Vec[U]]: `Some[Vec[U]]` if all elements were successfully collected, or `NONE` if a failure was encountered.

    Example:
    ```python
    >>> import pyochain as pc
    >>> # Successfully collecting an iterator of Option[int] into Option[Vec[int]]:
    >>> pc.Iter([pc.Some(1), pc.Some(2), pc.Some(3)]).try_collect()
    Some(Vec(1, 2, 3))
    >>> # Failing to collect in the same way:
    >>> pc.Iter([pc.Some(1), pc.Some(2), pc.NONE, pc.Some(3)]).try_collect()
    NONE
    >>> # A similar example, but with Result:
    >>> pc.Iter([pc.Ok(1), pc.Ok(2), pc.Ok(3)]).try_collect()
    Some(Vec(1, 2, 3))
    >>> pc.Iter([pc.Ok(1), pc.Err("error"), pc.Ok(3)]).try_collect()
    NONE
    >>> def external_fn(x: int) -> pc.Option[int]:
    ...     if x % 2 == 0:
    ...         return pc.Some(x)
    ...     return pc.NONE
    >>> pc.Iter([1, 2, 3, 4]).map(external_fn).try_collect()
    NONE
    >>> # Demonstrating that the iterator remains usable after a failure:
    >>> it = pc.Iter([pc.Some(1), pc.NONE, pc.Some(3), pc.Some(4)])
    >>> it.try_collect()
    NONE
    >>> it.try_collect()
    Some(Vec(3, 4))

    ```
    """
    collected: list[U] = []
    collected_add = collected.append
    for item in self._inner:
        match item:
            case Ok(val) | Some(val):
                collected_add(val)
            case _:
                return NONE
    return Some(Vec.from_ref(collected))

unique_to_each()

Return the elements from each of the iterators that aren't in the other iterators.

It is assumed that the elements of each iterable are hashable.

Credits

more_itertools.unique_to_each

Returns:

Type Description
Iter[Iter[U]]

Iter[Iter[U]]: An iterator of iterators, each containing the unique elements from the corresponding input iterable.

For example, suppose you have a set of packages, each with a set of dependencies:

{'pkg_1': {'A', 'B'}, 'pkg_2': {'B', 'C'}, 'pkg_3': {'B', 'D'}}

If you remove one package, which dependencies can also be removed?

If pkg_1 is removed, then A is no longer necessary - it is not associated with pkg_2 or pkg_3.

Similarly, C is only needed for pkg_2, and D is only needed for pkg_3:

>>> import pyochain as pc
>>> data = ({"A", "B"}, {"B", "C"}, {"B", "D"})
>>> pc.Iter(data).unique_to_each().map(lambda x: x.into(list)).collect()
Seq(['A'], ['C'], ['D'])

If there are duplicates in one input iterable that aren't in the others they will be duplicated in the output.

Input order is preserved:

>>> data = ("mississippi", "missouri")
>>> pc.Seq(data).iter().unique_to_each().map(lambda x: x.into(list)).collect()
Seq(['p', 'p'], ['o', 'u', 'r'])

Source code in src/pyochain/_iter.py
def unique_to_each[U: Iterable[Any]](self: Iter[U]) -> Iter[Iter[U]]:
    """Return the elements from each of the iterators that aren't in the other iterators.

    It is assumed that the elements of each iterable are hashable.

    **Credits**

        more_itertools.unique_to_each

    Returns:
        Iter[Iter[U]]: An iterator of iterators, each containing the unique elements from the corresponding input iterable.

    For example, suppose you have a set of packages, each with a set of dependencies:

    **{'pkg_1': {'A', 'B'}, 'pkg_2': {'B', 'C'}, 'pkg_3': {'B', 'D'}}**

    If you remove one package, which dependencies can also be removed?

    If pkg_1 is removed, then A is no longer necessary - it is not associated with pkg_2 or pkg_3.

    Similarly, C is only needed for pkg_2, and D is only needed for pkg_3:

    ```python
    >>> import pyochain as pc
    >>> data = ({"A", "B"}, {"B", "C"}, {"B", "D"})
    >>> pc.Iter(data).unique_to_each().map(lambda x: x.into(list)).collect()
    Seq(['A'], ['C'], ['D'])

    ```

    If there are duplicates in one input iterable that aren't in the others they will be duplicated in the output.

    Input order is preserved:
    ```python
    >>> data = ("mississippi", "missouri")
    >>> pc.Seq(data).iter().unique_to_each().map(lambda x: x.into(list)).collect()
    Seq(['p', 'p'], ['o', 'u', 'r'])

    ```
    """
    from collections import Counter

    pool: tuple[Iterable[U], ...] = tuple(self._inner)
    counts: Counter[U] = Counter(itertools.chain.from_iterable(map(set, pool)))
    uniques: set[U] = {element for element in counts if counts[element] == 1}

    return Iter((Iter(filter(uniques.__contains__, it))) for it in pool)

unzip()

Converts an iterator of pairs into a pair of iterators.

Returns:

Type Description
Unzipped[U, V]

Unzipped[U, V]: dataclass with first and second iterators.

Returns an Unzipped dataclass, containing two iterators:

  • one from the left elements of the pairs
  • one from the right elements.

This function is, in some sense, the opposite of .zip().

Note

Both iterators share the same underlying source.

Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

This is the unavoidable cost of having two independent iterators over the same source.

>>> import pyochain as pc
>>> data = [(1, "a"), (2, "b"), (3, "c")]
>>> unzipped = pc.Iter(data).unzip()
>>> unzipped.left.collect()
Seq(1, 2, 3)
>>> unzipped.right.collect()
Seq('a', 'b', 'c')
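The buffering described in the note comes from itertools.tee, which unzip builds on; a stdlib sketch of the same pattern:

```python
import itertools

pairs = [(1, "a"), (2, "b"), (3, "c")]
left_src, right_src = itertools.tee(iter(pairs), 2)
left = (pair[0] for pair in left_src)
right = (pair[1] for pair in right_src)

# Draining `left` first forces tee to buffer every pair internally
# until `right` consumes them as well.
print(list(left))   # [1, 2, 3]
print(list(right))  # ['a', 'b', 'c']
```

Consuming both sides roughly in step keeps the shared buffer small.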
Source code in src/pyochain/_iter.py
def unzip[U, V](self: Iter[tuple[U, V]]) -> Unzipped[U, V]:
    """Converts an iterator of pairs into a pair of iterators.

    Returns:
        Unzipped[U, V]: dataclass with first and second iterators.


    Returns an `Unzipped` dataclass, containing two iterators:

    - one from the left elements of the pairs
    - one from the right elements.

    This function is, in some sense, the opposite of `.zip()`.

    Note:
        Both iterators share the same underlying source.

        Values consumed by one iterator remain in the shared buffer until the other iterator consumes them too.

        This is the unavoidable cost of having two independent iterators over the same source.

    ```python
    >>> import pyochain as pc
    >>> data = [(1, "a"), (2, "b"), (3, "c")]
    >>> unzipped = pc.Iter(data).unzip()
    >>> unzipped.left.collect()
    Seq(1, 2, 3)
    >>> unzipped.right.collect()
    Seq('a', 'b', 'c')

    ```
    """
    left, right = itertools.tee(self._inner, 2)
    return Unzipped(Iter(x[0] for x in left), Iter(x[1] for x in right))

with_position()

Return an iterable over (Position, T) tuples.

The Position indicates whether the item T is the first, middle, last, or only element in the iterable.

Returns:

Type Description
Iter[tuple[Position, T]]

Iter[tuple[Position, T]]: An iterable of (Position, item) tuples.

Example:

>>> import pyochain as pc
>>> pc.Iter(["a", "b", "c"]).with_position().collect()
Seq(('first', 'a'), ('middle', 'b'), ('last', 'c'))
>>> pc.Iter(["a"]).with_position().collect()
Seq(('only', 'a'),)

Source code in src/pyochain/_iter.py
def with_position(self) -> Iter[tuple[Position, T]]:
    """Return an iterable over (`Position`, `T`) tuples.

    The `Position` indicates whether the item `T` is the first, middle, last, or only element in the iterable.

    Returns:
        Iter[tuple[Position, T]]: An iterable of (`Position`, item) tuples.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter(["a", "b", "c"]).with_position().collect()
    Seq(('first', 'a'), ('middle', 'b'), ('last', 'c'))
    >>> pc.Iter(["a"]).with_position().collect()
    Seq(('only', 'a'),)

    ```
    """

    def _gen(data: Iterator[T]) -> Iterator[tuple[Position, T]]:
        try:
            first = next(data)
        except StopIteration:
            return

        try:
            second = next(data)
        except StopIteration:
            yield ("only", first)
            return
        yield ("first", first)

        current: T = second
        for nxt in self._inner:
            yield ("middle", current)
            current = nxt
        yield ("last", current)

    return Iter(_gen(self._inner))

zip(*others, strict=False)

zip(
    iter1: Iterable[T1], /, *, strict: bool = ...
) -> Iter[tuple[T, T1]]
zip(
    iter1: Iterable[T1],
    iter2: Iterable[T2],
    /,
    *,
    strict: bool = ...,
) -> Iter[tuple[T, T1, T2]]
zip(
    iter1: Iterable[T1],
    iter2: Iterable[T2],
    iter3: Iterable[T3],
    /,
    *,
    strict: bool = ...,
) -> Iter[tuple[T, T1, T2, T3]]
zip(
    iter1: Iterable[T1],
    iter2: Iterable[T2],
    iter3: Iterable[T3],
    iter4: Iterable[T4],
    /,
    *,
    strict: bool = ...,
) -> Iter[tuple[T, T1, T2, T3, T4]]

Yields n-length tuples, where n is the total number of zipped iterables (this Iter plus the positional arguments).

The i-th element in every tuple comes from the i-th iterable argument to .zip().

This continues until the shortest argument is exhausted.

Note

Iter.map_star can then be used for subsequent operations on the zipped values, in a destructuring manner. This keeps the code clean and readable, without index access like [0] and [1] in inline lambdas.

Parameters:

Name Type Description Default
*others Iterable[Any]

Other iterables to zip with.

()
strict bool

If True and one of the arguments is exhausted before the others, raise a ValueError.

False

Returns:

Type Description
Iter[tuple[Any, ...]]

Iter[tuple[Any, ...]]: An Iter of tuples containing elements from the zipped Iter and other iterables.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2]).zip([10, 20]).collect()
Seq((1, 10), (2, 20))
>>> pc.Iter(["a", "b"]).zip([1, 2, 3]).collect()
Seq(('a', 1), ('b', 2))
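The destructuring pattern mentioned in the note corresponds to itertools.starmap in plain Python (a sketch of the idea):

```python
import itertools

pairs = zip([1, 2], [10, 20])
# Each tuple is unpacked into the lambda's parameters,
# avoiding index access like x[0] and x[1].
print(list(itertools.starmap(lambda a, b: a + b, pairs)))
# [11, 22]
```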

Source code in src/pyochain/_iter.py
def zip(
    self,
    *others: Iterable[Any],
    strict: bool = False,
) -> Iter[tuple[Any, ...]]:
    """Yields n-length tuples, where n is the number of iterables passed as positional arguments.

    The i-th element in every tuple comes from the i-th iterable argument to `.zip()`.

    This continues until the shortest argument is exhausted.

    Note:
        `Iter.map_star` can then be used for subsequent operations on the zipped values, in a destructuring manner.
        This keeps the code clean and readable, without index access like `[0]` and `[1]` in inline lambdas.

    Args:
        *others (Iterable[Any]): Other iterables to zip with.
        strict (bool): If `True` and one of the arguments is exhausted before the others, raise a ValueError.

    Returns:
        Iter[tuple[Any, ...]]: An `Iter` of tuples containing elements from the zipped Iter and other iterables.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2]).zip([10, 20]).collect()
    Seq((1, 10), (2, 20))
    >>> pc.Iter(["a", "b"]).zip([1, 2, 3]).collect()
    Seq(('a', 1), ('b', 2))

    ```
    """
    return Iter(zip(self._inner, *others, strict=strict))

zip_longest(*others)

zip_longest(
    iter2: Iterable[T2],
) -> Iter[tuple[Option[T], Option[T2]]]
zip_longest(
    iter2: Iterable[T2], iter3: Iterable[T3]
) -> Iter[tuple[Option[T], Option[T2], Option[T3]]]
zip_longest(
    iter2: Iterable[T2],
    iter3: Iterable[T3],
    iter4: Iterable[T4],
) -> Iter[
    tuple[Option[T], Option[T2], Option[T3], Option[T4]]
]
zip_longest(
    iter2: Iterable[T2],
    iter3: Iterable[T3],
    iter4: Iterable[T4],
    iter5: Iterable[T5],
) -> Iter[
    tuple[
        Option[T],
        Option[T2],
        Option[T3],
        Option[T4],
        Option[T5],
    ]
]
zip_longest(
    iter2: Iterable[T],
    iter3: Iterable[T],
    iter4: Iterable[T],
    iter5: Iterable[T],
    iter6: Iterable[T],
    /,
    *iterables: Iterable[T],
) -> Iter[tuple[Option[T], ...]]

Return a zip iterator that yields tuples where the i-th element comes from the i-th iterable argument.

Yield values until the longest iterable in the argument sequence is exhausted, and then it raises StopIteration.

The longest iterable determines the length of the returned iterator, and will return Some[T] until exhaustion.

When the shorter iterables are exhausted, they yield NONE.

Parameters:

Name Type Description Default
*others Iterable[Any]

Other iterables to zip with.

()

Returns:

Type Description
Iter[tuple[Option[Any], ...]]

Iter[tuple[Option[Any], ...]]: An iterable of tuples containing optional elements from the zipped iterables.

Example:

>>> import pyochain as pc
>>> pc.Iter([1, 2]).zip_longest([10]).collect()
Seq((Some(1), Some(10)), (Some(2), NONE))
>>> # Can be combined with try collect to filter out the NONE:
>>> pc.Iter([1, 2]).zip_longest([10]).map(lambda x: pc.Iter(x).try_collect()).collect()
Seq(Some(Vec(1, 10)), NONE)

Source code in src/pyochain/_iter.py
def zip_longest(self, *others: Iterable[Any]) -> Iter[tuple[Option[Any], ...]]:
    """Return a zip Iterator who yield a tuple where the i-th element comes from the i-th iterable argument.

    Yield values until the longest iterable in the argument sequence is exhausted, and then it raises StopIteration.

    The longest iterable determines the length of the returned iterator, and will return `Some[T]` until exhaustion.

    When the shorter iterables are exhausted, they yield `NONE`.

    Args:
        *others (Iterable[Any]): Other iterables to zip with.

    Returns:
        Iter[tuple[Option[Any], ...]]: An iterable of tuples containing optional elements from the zipped iterables.

    Example:
    ```python
    >>> import pyochain as pc
    >>> pc.Iter([1, 2]).zip_longest([10]).collect()
    Seq((Some(1), Some(10)), (Some(2), NONE))
    >>> # Can be combined with try collect to filter out the NONE:
    >>> pc.Iter([1, 2]).zip_longest([10]).map(lambda x: pc.Iter(x).try_collect()).collect()
    Seq(Some(Vec(1, 10)), NONE)

    ```
    """
    return Iter(
        tuple(Option(t) for t in tup)
        for tup in itertools.zip_longest(self._inner, *others, fillvalue=None)
    )