All Questions
102,296 questions
0 votes · 0 answers · 8 views
gdal_array can't be imported in Python-GDAL
I'm trying to process some .bil files on Debian 12.
I installed GDAL using:
sudo apt install gdal-bin
sudo apt-get install libgdal-dev
And ogrinfo --version outputs:
GDAL 3.6.2, released 2023/...
-1 votes · 0 answers · 52 views
How do I do Eigen decomposition of a generalized Fibonacci matrix to arbitrary precision?
This is a follow-up to my previous question.
I want to efficiently compute the Nth term of higher-order generalized Fibonacci numbers; N is sufficiently large that the Nth term is guaranteed to be ...
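Since an exact eigendecomposition of the companion matrix needs arbitrary-precision arithmetic anyway, one sketch of an alternative is to skip it entirely and raise the companion matrix to the required power with Python's native big integers, which are exact at any size (the names `kbonacci` and `mat_pow` are invented for illustration; the convention that the first k terms are 0, ..., 0, 1 is an assumption):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][t] * B[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, e):
    """Exponentiation by squaring over exact Python integers."""
    n = len(M)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while e:
        if e & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        e >>= 1
    return R

def kbonacci(n, k=2):
    """a(0..k-2) = 0, a(k-1) = 1, a(n) = a(n-1) + ... + a(n-k)."""
    if n < k - 1:
        return 0
    # companion matrix of the order-k recurrence
    C = [[1] * k] + [[int(j == i) for j in range(k)] for i in range(k - 1)]
    # initial state vector is (a(k-1), ..., a(0)) = (1, 0, ..., 0),
    # so the answer is just the top-left entry of C^(n-k+1)
    return mat_pow(C, n - (k - 1))[0][0]
```

This costs O(k³ log N) big-integer multiplications; an eigendecomposition route would instead need enough mpmath precision to round the floating result back to the exact integer.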
0 votes · 0 answers · 49 views
Why doesn’t the barycenter method detect subpixel displacements where correlation does?
I’m working with X-ray imaging data.
I have a reference image containing a structured pattern, and a sample image where this pattern is slightly distorted due to the presence of a physical sample.
My ...
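Without the full post it's hard to be sure, but a common culprit is a nonzero background: the barycenter is a weighted mean over the whole window, so any flat offset drags every centroid toward the window centre and shrinks the measured displacement, while cross-correlation is largely insensitive to it. A minimal 1-D sketch (the Gaussian spot, window size, and shift are invented for illustration):

```python
import numpy as np

x = np.arange(64, dtype=float)

def spot(center, background=0.0):
    """1-D stand-in for the pattern: a Gaussian peak plus a flat background."""
    return background + np.exp(-0.5 * ((x - center) / 3.0) ** 2)

def barycenter(img):
    return np.sum(x * img) / np.sum(img)

true_shift = 0.3
# clean images: the centroid tracks the 0.3-pixel shift almost exactly
clean = barycenter(spot(32.0 + true_shift)) - barycenter(spot(32.0))
# with a flat background, both centroids are pulled toward the window
# centre, so the measured displacement collapses toward zero
biased = barycenter(spot(32.0 + true_shift, 1.0)) - barycenter(spot(32.0, 1.0))
```

Subtracting the background (or windowing tightly around the peak) before taking the barycenter usually restores the subpixel sensitivity.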
1 vote · 1 answer · 33 views
import gensim binary incompatibility
import gensim
import numpy
import scipy
print("gensim version:", gensim.__version__)
print("numpy version:", numpy.__version__)
print("scipy version:", scipy.__version__)
...
2 votes · 2 answers · 62 views
Pandas: Fill in missing values with an empty numpy array
I have a Pandas DataFrame that I derive from a process like this:
df1 = pd.DataFrame({'c1':['A','B','C','D','E'],'c2':[1,2,3,4,5]})
df2 = pd.DataFrame({'c1':['A','B','C'],'c2':[1,2,3],'c3': [np.array((...
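Assuming the goal is a left merge of df2 onto df1 with empty arrays wherever c3 is missing, note that `fillna` rejects non-scalar fill values, so one sketch is to patch the NaNs element-wise after the merge (the `c3` contents below are invented to complete the truncated snippet):

```python
import numpy as np
import pandas as pd

df1 = pd.DataFrame({'c1': ['A', 'B', 'C', 'D', 'E'], 'c2': [1, 2, 3, 4, 5]})
df2 = pd.DataFrame({'c1': ['A', 'B', 'C'], 'c2': [1, 2, 3],
                    'c3': [np.array([1.0]), np.array([2.0]), np.array([3.0])]})

# rows D and E get NaN in c3 after the left merge
merged = df1.merge(df2, on=['c1', 'c2'], how='left')

# fillna() refuses array values, so replace the NaNs element-wise instead
merged['c3'] = merged['c3'].apply(
    lambda v: v if isinstance(v, np.ndarray) else np.array([]))
```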
-1 votes · 1 answer · 82 views
Python interpreter getting killed when writing into a large-ish numpy array (but much smaller than the RAM)
The following Python code allocates an 8GB numpy array and writes into it. It kills the Python interpreter regardless of the machine's RAM size (it happens on a server with 384GB of RAM)....
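A likely explanation (assuming Linux) is memory overcommit: `np.zeros`/`np.empty` only reserve virtual address space, and physical pages are committed as they are written, at which point the kernel OOM killer, or a cgroup memory limit such as a container or batch-scheduler cap set well below the machine's 384 GB, terminates the process. If the limit cannot be raised, a disk-backed memmap is one way to keep the working set out of RAM, sketched here at a deliberately small size:

```python
import os
import tempfile
import numpy as np

# Allocation succeeds lazily; the kill happens on *writing*, when pages
# are actually committed. A memmap backs the array with a file instead.
path = os.path.join(tempfile.mkdtemp(), 'big.dat')

# 1e6 float64s = 8 MB for the demo; the question's array would be 8 GB
a = np.memmap(path, dtype=np.float64, mode='w+', shape=(1_000_000,))
a[:] = 1.0      # writes go to the page cache / disk, not anonymous RAM
a.flush()
```

Checking `ulimit -a` and the cgroup's `memory.max` is usually the first diagnostic step.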
2 votes · 1 answer · 53 views
Alternative to looping over one numpy axis
I have two numpy arrays a and b such that a.shape[:-1] and b.shape are broadcastable. With this constraint only, I want to calculate an array c according to the following:
c = numpy.empty(numpy....
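The truncated snippet suggests a loop over the last axis of `a`. Assuming the per-slice operation is itself a broadcastable ufunc (plain multiplication is used as a stand-in here), appending a trailing axis to `b` with `b[..., None]` lets a single vectorized expression replace the loop:

```python
import numpy as np

a = np.arange(24.0).reshape(2, 3, 4)   # a.shape[:-1] == (2, 3)
b = np.arange(3.0)                     # broadcastable with a.shape[:-1]

# loop version: one slice of the last axis at a time
c_loop = np.empty(np.broadcast_shapes(a.shape[:-1], b.shape) + a.shape[-1:])
for i in range(a.shape[-1]):
    c_loop[..., i] = a[..., i] * b

# vectorized: a trailing length-1 axis on b lines it up with a's last axis
c_vec = a * b[..., None]
```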
0 votes · 2 answers · 66 views
Unexpected behavior with array slicing and mask
It was unexpected that
x=np.empty((2,10,5))
x.shape
>>> (2, 10, 5)
x[0].shape, x[0,:,:].shape
>>> ((10, 5), (10, 5))
mask = [True,True,True,False,False]
x[0,:,mask].shape
>>> ...
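This is NumPy's mixed basic/advanced indexing rule: `0` and `mask` are both advanced indices, and because the slice `:` sits between them, the broadcast advanced-index axes are moved to the front of the result. A sketch of the behavior and the two-step workaround:

```python
import numpy as np

x = np.empty((2, 10, 5))
mask = np.array([True, True, True, False, False])

# 0 and mask are advanced indices separated by a slice, so the
# advanced axes (length 3 from the mask) jump to the front:
assert x[0, :, mask].shape == (3, 10)

# indexing in two steps uses only one advanced index per operation,
# so the axes stay in the expected order:
assert x[0][:, mask].shape == (10, 3)
```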
1 vote · 0 answers · 23 views
Lightgbm custom multiclass objective vs native [migrated]
I am trying to familiarise myself with the custom objective function in lightgbm. As an exercise, I am trying to rewrite the multiclass classification log loss and compare the result with the inbuilt ...
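For reference while comparing, the gradient/Hessian pair for softmax log loss that a custom multiclass objective returns is `p - y` with the diagonal Hessian approximation `p(1 - p)`. A pure-NumPy sketch (the flat row-major layout of `raw` is an assumption here; LightGBM's expected ordering has varied across versions, which is a frequent source of mismatch against the built-in `multiclass` objective):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # shift for stability
    return e / e.sum(axis=1, keepdims=True)

def multiclass_obj(raw, y, num_class):
    """Gradient and diagonal Hessian of softmax log loss — the pair a
    custom objective returns. Assumes raw is row-major (sample-major)."""
    p = softmax(np.asarray(raw).reshape(-1, num_class))
    onehot = np.eye(num_class)[np.asarray(y)]
    grad = p - onehot            # d(loss)/d(raw)
    hess = p * (1.0 - p)         # diagonal approximation
    return grad.ravel(), hess.ravel()
```

Verifying the analytic gradient against finite differences on a tiny batch is a quick way to confirm the layout matches before blaming the math.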
-2 votes · 2 answers · 87 views
Fastest way to convert results from tuple of tuples to 2D numpy.array
I'm training my AI model with a huge data set, so it's impossible to preload all the data into memory at the beginning. I'm currently using psycopg2 to load data from a PostgreSQL DB during ...
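One commonly fast route, sketched below with invented data standing in for a `cursor.fetchall()` result, is `np.fromiter` over a flattened iterator: it makes a single pass and avoids the per-row temporary objects that `np.array(rows)` inspects:

```python
import numpy as np
from itertools import chain

rows = ((1.0, 2.0), (3.0, 4.0), (5.0, 6.0))   # stand-in for fetchall()
n, m = len(rows), len(rows[0])

# one pass over a flat iterator; count= lets NumPy preallocate the buffer
arr = np.fromiter(chain.from_iterable(rows),
                  dtype=np.float64, count=n * m).reshape(n, m)
```

As always with "fastest" claims, it's worth timing this against plain `np.array(rows)` on the real row width, since the winner can depend on shape and dtype.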
1 vote · 1 answer · 90 views
Format np.float64 without leading digits
I need to format np.float64 floating-point values without a leading digit before the dot, for example -2.40366982307 as -.240366982307E+01, in Python.
This is to allow me to write in RINEX 3.03 the values ...
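Python's `:.xE` formatting always emits one digit before the dot, so one sketch is to post-process that output, shifting the digits and bumping the exponent by one (the function name `rinex_float` is invented, and zero or rounding edge cases would need extra care in real RINEX output):

```python
def rinex_float(x, prec=12):
    """Format x Fortran-style as ±.dddd...E±ee with no digit before
    the dot — a sketch for RINEX-like fixed layouts."""
    s = f"{x:.{prec - 1}E}"                  # e.g. '-2.40366982307E+00'
    mant, exp = s.split('E')
    sign = '-' if mant.startswith('-') else ''
    digits = mant.replace('-', '').replace('.', '')
    # moving the dot left one place means the exponent grows by one
    return f"{sign}.{digits}E{int(exp) + 1:+03d}"
```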
0 votes · 2 answers · 34 views
Python Sklearn.Model_Selection giving error numpy.dtype size changed
I have a train test split code
from sklearn.model_selection import train_test_split
train_df, test_df = train_test_split(new_cleaned_df, test_size=0.05, random_state=42, shuffle=True)
train_df....
-1 votes · 0 answers · 33 views
Efficiently Finding the Indices of the N Largest Values in a NumPy Array Without Sorting the Entire Array [duplicate]
I'm working with very large NumPy arrays (millions to billions of elements) and need to find the indices of the N largest values in the array. Using np.argsort() followed by slicing to get the last N ...
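`np.argpartition` is the usual answer here: it selects the N largest in O(len(a)) time without ordering the rest of the array, and only the N winners are sorted afterwards if descending order is wanted. A sketch (the helper name is invented):

```python
import numpy as np

def top_n_indices(a, n):
    """Indices of the n largest values of a, largest first.
    argpartition is O(len(a)); only the n winners are then sorted."""
    idx = np.argpartition(a, -n)[-n:]       # unordered top-n indices
    return idx[np.argsort(a[idx])[::-1]]    # order them by value, descending
```

If the order of the N results does not matter, the `argsort` line can be dropped entirely.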
2 votes · 4 answers · 98 views
Pandas - fillna multiple columns with a given series, matching by index?
I'd like to use fillna to fill multiple columns of a Pandas DataFrame with the same series, matching by index:
import numpy as np
import pandas as pd
df_1 = pd.DataFrame(index = [0, 1, 2],...
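Assuming the intent is to fill the NaNs of several columns from one series aligned by index, `fillna` accepts a DataFrame and aligns it on both index and columns, so broadcasting the series into one column per target does the job. A sketch with invented sample data completing the truncated setup:

```python
import numpy as np
import pandas as pd

df_1 = pd.DataFrame({'a': [1.0, np.nan, 3.0], 'b': [np.nan, 5.0, np.nan]},
                    index=[0, 1, 2])
s = pd.Series([10.0, 20.0, 30.0], index=[0, 1, 2])

cols = ['a', 'b']
# one fill column per target, all built from the same series;
# fillna aligns the fill DataFrame on both index and column labels
filled = df_1.fillna(pd.DataFrame({c: s for c in cols}))
```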
1 vote · 1 answer · 45 views
TypeError: Object of type ndarray is not JSON serializable despite custom converter (Nested Dict/NumPy 2.0)
I am working with simulation results stored in a deeply nested defaultdict structure. This structure (data_to_save) mixes standard Python types (lists, ints, floats, None) with NumPy arrays.
I need to ...
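A `default=` converter only sees objects `json` cannot already handle and must return something serializable in one step, which makes deeply nested structures fragile (and NumPy 2.0 removed some aliases that older converters relied on). One sketch instead normalizes the whole tree before dumping (`to_jsonable` is an invented helper):

```python
import json
from collections import defaultdict
import numpy as np

def to_jsonable(obj):
    """Recursively replace NumPy arrays/scalars with plain Python values."""
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    if isinstance(obj, np.generic):          # np.float64, np.int64, ...
        return obj.item()
    if isinstance(obj, dict):                # covers defaultdict too
        return {k: to_jsonable(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [to_jsonable(v) for v in obj]
    return obj                               # str, int, float, None, bool

data_to_save = defaultdict(dict)
data_to_save['run']['x'] = np.arange(3)
data_to_save['run']['loss'] = np.float64(0.5)
text = json.dumps(to_jsonable(data_to_save))
```

Note that non-string dict *keys* (e.g. `np.int64` keys) would still need converting, since `default=` and this sketch only handle values.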