2,629 questions
3
votes
1
answer
49
views
How do let and case differ in terms of laziness?
How do let and case differ in terms of laziness? Where can I find this information myself? And how do I experiment with it in ghci?
Some context
This incomplete piece of code is an excerpt from
Parallel ...
-1
votes
0
answers
170
views
Is there any advantage to using a lazy tokenizer other than space & time efficiency?
Recently I tried to implement a tokenizer following this Python doc example for Software Design for Flexibility (SDF) exercise 5.7 (note that this is not a book that teaches compilers specifically ...
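Since the excerpt above is cut off, here is a minimal sketch of what a lazy, generator-based tokenizer along the lines of the tokenizer example in the Python re docs can look like; the token names and patterns are illustrative assumptions, not the asker's code:
import itertools
import re
# Token names and patterns below are placeholders for illustration only.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("NAME",   r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/()=]"),
    ("SKIP",   r"\s+"),
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))
def tokenize(text):
    # finditer is itself an iterator, so tokens are produced one at a time;
    # the caller decides how far the input is actually scanned.
    for m in TOKEN_RE.finditer(text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())
# Only the first three tokens are ever materialized here:
print(list(itertools.islice(tokenize("x = 40 + 2"), 3)))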
0
votes
0
answers
30
views
PySpark: `assert_unique` sometimes fails on first run, but passes on re-run without data changes
I'm using PySpark in a Jupyter Notebook and have a helper function that checks whether a column (or set of columns) has unique values:
def assert_unique(df: DataFrame, column_names: str | list[str]) ->...
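The helper above is truncated; a check of that shape might look like the sketch below, where the counting strategy and message are assumptions. Because Spark plans are lazy, each action can re-evaluate the plan from the source, so a non-deterministic upstream step can make the two counts disagree on one run and agree on a re-run:
from pyspark.sql import DataFrame
def assert_unique(df: DataFrame, column_names: str | list[str]) -> None:
    # Sketch: compare the total row count with the distinct count over the
    # given columns. Both counts are actions, each triggering evaluation of
    # the lazy plan, potentially twice over the same (uncached) source.
    cols = [column_names] if isinstance(column_names, str) else list(column_names)
    total = df.count()
    distinct = df.select(*cols).distinct().count()
    assert total == distinct, f"{cols} not unique: {total} rows, {distinct} distinct"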
3
votes
0
answers
69
views
force order of evaluation inside a tidypolars call
Is it possible in some way to force the evaluation order in a {tidypolars} pipe?
For example:
install.packages("tidypolars", repos = c("https://community.r-multiverse.org", '...
0
votes
1
answer
42
views
Why does Swift's lazy var solve the error: Cannot assign value of type '(HomeViewController) -> () -> HomeViewController' to type
The code below gives an error:
class HomeViewController: UIViewController, UITableViewDataSource, UITableViewDelegate {
let tableView = {
let t = UITableView(frame: .zero, style: .grouped)...
1
vote
0
answers
50
views
Lazy de-serialization of a JSON array as a Stream<T> [duplicate]
Suppose I have a large (potentially infinite) array of JSON objects:
[
  {
    "id": 41,
    "name": "foo"
  },
  {
    "id": 42,
    "...
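The question asks about Java's Stream<T>; as a sketch of the same lazy idea in Python (not the Java API the asker wants), the third-party ijson library can walk a top-level JSON array incrementally, yielding one object at a time instead of parsing the whole document up front:
import ijson  # third-party incremental JSON parser (pip install ijson)
def stream_objects(path):
    # "item" is ijson's prefix for the elements of a top-level JSON array;
    # objects are yielded lazily as the file is read.
    with open(path, "rb") as f:
        yield from ijson.items(f, "item")
# Consume lazily, e.g. stop at the first match without reading the rest:
# first = next(o for o in stream_objects("data.json") if o["name"] == "foo")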
4
votes
0
answers
55
views
Lazy evaluations for DataFrames
Let me provide a quick demo which shows that the second approach is 10x slower than the first one.
import pandas as pd
from timeit import default_timer as timer
r = range(1,int(1e7))
df = pd....
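The demo above is cut off before the two approaches appear; a stand-in comparison of the same shape is sketched below, using a vectorized column operation versus a row-wise apply as placeholders for the asker's two approaches. Note that pandas evaluates both eagerly; there is no lazy plan being optimized here.
import pandas as pd
from timeit import default_timer as timer
r = range(1, int(1e7))
df = pd.DataFrame({"x": r})
# Placeholder "first approach": a vectorized column operation.
start = timer()
fast = df["x"] * 2
print("vectorized:", timer() - start)
# Placeholder "second approach": the same computation via a per-row Python call.
start = timer()
slow = df["x"].apply(lambda v: v * 2)
print("apply:", timer() - start)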
5
votes
1
answer
94
views
Trouble with understanding std::views::filter -> Lazy Evaluation
auto ResultView
{
    std::views::iota(1,30)
    | std::views::take(20)
    | std::views::filter([](int x){ return x%2!=0; })
};
std::cout<<...
0
votes
3
answers
98
views
Why does this Haskell program have incorrect time complexity?
newtype Prob a = Prob { getProb :: [(a,Rational)] } deriving (Show,Eq,Functor)
flatten :: Prob (Prob a) -> Prob a
flatten (Prob xs) = Prob $ concat $ map multAll xs
where multAll (Prob innerxs,...
1
vote
2
answers
302
views
How should one migrate from lazy_static to std LazyLock/LazyCell
I created a library that uses lazy_static to create a static HashMap that other parts of the code can reference to look up values like this:
use lazy_static::lazy_static;
use std::collections::HashMap;...
2
votes
3
answers
118
views
Lazy initialization of const attributes in C++
I would like to carry out a lazy initialization of a set of (std::vector) attributes in C++. They have to be const, in the sense that after the first time they are initialized (via a get method), ...
1
vote
2
answers
180
views
Migrating use of .GetAwaiter().GetResult() in a lambda statement to async [closed]
Given this ConcurrentDictionary:
private readonly ConcurrentDictionary<TKey, Lazy<TValue>> _lazyDictionary = new();
I have a wrapping class that provides custom access to the dictionary......
1
vote
0
answers
65
views
Using lazy-seq with Datomic Query Results: Is this the right approach?
I'm working with Datomic and need to process a large result set from a query. I want to make the processing lazy to handle the large output efficiently. Here's my current approach:
(let [normalized-...
15
votes
1
answer
358
views
Why don't replacement functions use lazy evaluation?
Replacement functions, such as names<-, seem to not use lazy evaluation when called like names(x) <- c("a", "b").
To demonstrate, let's define a function to get the ...
0
votes
1
answer
309
views
ETL slow after Databricks Runtime upgrade
We are in the process of upgrading our Databricks platform.
A couple of weeks ago we set up Unity Catalog.
Now we are trying to go from Databricks Runtime 13.3 LTS to 15.4 LTS.
Two notebooks that we are running (out ...