How to fix Spyder breaking at line 1 during debugging without any breakpoint

My issue is that during debugging my code always stops at line 1. For example, this is my code:
import gc
import os
import pandas as pd
import shutil
from pandas import DataFrame
from pathlib import Path
from datetime import datetime
from collections import OrderedDict
It simply behaves as if there is a breakpoint at line 1. Previously there were comments on the first few lines, and Spyder would break in the middle of the comments. I have to press "Continue" for it to proceed.
This does not happen in my other Python files, so I don't have the slightest clue how to fix this, short of writing the program incrementally and running it.
Has anyone faced this issue?

In the settings, under iPython console -> Debugger, you will see an option "Stop debugging on first line of file without breakpoints". Uncheck it to stop Spyder from pausing at line 1.


Where is MobyLCPSolver?
ImportError: cannot import name 'MobyLCPSolver' from 'pydrake.all' (/home/docker/drake/drake-build/install/lib/python3.8/site-packages/pydrake/all.py)
I have the latest Drake and cannot import it.
Can anyone help?
As of pydrake v1.12.0, the MobyLcp C++ API is not bound in Python.
However, if you feed an LCP into Solve() then Drake can choose Moby to solve it. You can take advantage of this to create an instance of MobyLCP:
import numpy as np
from pydrake.all import (
    ChooseBestSolver,
    MakeSolver,
    MathematicalProgram,
)
prog = MathematicalProgram()
x = prog.NewContinuousVariables(2)
prog.AddLinearComplementarityConstraint(np.eye(2), np.array([1, 2]), x)
moby_id = ChooseBestSolver(prog)
moby = MakeSolver(moby_id)
print(moby.SolverName())
# The output is: "Moby LCP".
# The C++ type of the `moby` object is drake::solvers::MobyLCP.
That only allows calling Moby via the MathematicalProgram interface, however. To call any MobyLCP-specific C++ functions such as SolveLcpFastRegularized, those would first need to be added to the bindings code before they could be used.
When you need access to C++ classes or functions that aren't bound into Python yet, you can file a feature request on the Drake GitHub page, or, even better, open a pull request with the bindings you need.
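For context, the linear complementarity constraint the snippet feeds to Drake asks for z >= 0 with w = Mz + q >= 0 and zᵀw = 0. The tiny problem above (M = I, q = [1, 2]) is small enough to check by hand with a brute-force active-set enumeration in plain NumPy. This is only an illustrative sketch of what an LCP solution looks like, not Drake's Moby algorithm:

```python
import numpy as np

def solve_lcp_bruteforce(M, q):
    # Enumerate active sets: for each subset A, fix z[i] = 0 for i not in A,
    # solve M[A,A] z[A] = -q[A], and accept if z >= 0 and w = M z + q >= 0.
    # Exponential in n, so only sensible for tiny illustrative problems.
    n = len(q)
    for mask in range(2 ** n):
        active = [i for i in range(n) if mask >> i & 1]
        z = np.zeros(n)
        if active:
            sub = np.ix_(active, active)
            try:
                z[active] = np.linalg.solve(M[sub], -q[active])
            except np.linalg.LinAlgError:
                continue
        w = M @ z + q
        if (z >= -1e-9).all() and (w >= -1e-9).all():
            return z
    return None

M = np.eye(2)
q = np.array([1.0, 2.0])
z = solve_lcp_bruteforce(M, q)
print(z)  # [0. 0.] -- since q >= 0, z = 0 already satisfies complementarity
```

Because q is already nonnegative here, the trivial solution z = 0 (with w = q) satisfies all three conditions, which is what the enumeration returns.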

nltk.download('punkt') returning False

When I try to install nltk and download the punkt resource using nltk.download('punkt'),
I get the following errors. I have tried many alternative snippets and switching networks.
Please help with this error.
After applying:
df['num_words'] = df['text'].apply(lambda x: len(nltk.word_tokenize(x)))
I am getting the error:
Resource punkt not found.
Please use the NLTK Downloader to obtain the resource:
import nltk
nltk.download('punkt')
For more information see: https://www.nltk.org/data.html
Attempted to load tokenizers/punkt/english.pickle
I tried some alternative code like
import nltk
import ssl
try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    pass
else:
    ssl._create_default_https_context = _create_unverified_https_context
nltk.download()
I also tried switching networks, as in some places I found it described as a server issue.
Try launching the Jupyter notebook session as administrator (open the command or Anaconda prompt as administrator).
The last option would be to download the corpus manually. You may find this helpful in your case.
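As a stopgap while punkt is unavailable, the word count from the question can be approximated with plain whitespace splitting instead of nltk.word_tokenize. A minimal sketch (the df['text'] column name is taken from the question; the sample rows are made up):

```python
import pandas as pd

df = pd.DataFrame({"text": ["Hello world!", "NLTK needs the punkt model."]})

# Rough fallback: split on whitespace instead of nltk.word_tokenize.
# Counts can differ from punkt on punctuation-heavy text (punkt splits
# off trailing punctuation as separate tokens), but it needs no download.
df["num_words"] = df["text"].apply(lambda x: len(x.split()))
print(df["num_words"].tolist())  # [2, 5]
```

Once nltk.download('punkt') succeeds, the lambda can be swapped back to len(nltk.word_tokenize(x)) for proper tokenization.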

Biopython SeqIO error: local variable 'qual' referenced before assignment

I sent some samples for Sanger sequencing to a commercial facility. I'm able to read the files they send using the command
from Bio import SeqIO
from Bio import Seq
rec = SeqIO.read("isolation-round4/3dr23_Forward.ab1",'abi-trim').seq
But recently, due to a move, we had to send the samples elsewhere for sequencing. Now, if I try to run the same command on the output I get an error:
UnboundLocalError: local variable 'qual' referenced before assignment in
File "C:\Users\Anaconda3\lib\site-packages\Bio\SeqIO\AbiIO.py", line 462, in AbiIterator letter_annotations={"phred_quality": qual}
I would appreciate any help in dealing with this. Here are two files, one that works and one that does not, if you would like to have a look.
Thanks in advance for your help!
This bug should already be fixed in Biopython 1.77.
Update: See https://github.com/biopython/biopython/issues/3221 - turned out to be a new unexpected configuration of the ABI software producing files with no quality scores.
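A quick way to tell whether an installed Biopython predates that fix is to compare Bio.__version__ against 1.77. A minimal sketch of the comparison logic (the version strings passed in below are just examples):

```python
def predates_fix(version, minimum=(1, 77)):
    # Compare a 'major.minor' Biopython version string against the release
    # said to contain the AbiIO fix (1.77, per the answer above).
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts < minimum

print(predates_fix("1.76"))  # True: upgrade, e.g. 'pip install -U biopython'
print(predates_fix("1.78"))  # False: the fix should already be included
```

In practice you would pass Bio.__version__ as the first argument and upgrade if the check returns True.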

Can Kivy be run on Windows?

This question has been asked a lot of times, and I guess it's my turn to ask it now. I'm on Windows using WSL2. I ran a simple Kivy program to draw a chess board.
import kivy
from kivy.app import App
from kivy.uix.widget import Widget
from kivy.graphics import Rectangle, Color

class ChessBoard(Widget):
    def __init__(self, **kwargs):
        "Making the chess board"
        super(ChessBoard, self).__init__(**kwargs)
        with self.canvas:
            self.board = Rectangle(source='blank-chess-board.gif', pos=(125, 90), size=(520, 520))

class ChessApp(App):
    def build(self):
        return ChessBoard()

if __name__ == '__main__':
    ChessApp().run()
It said: [CRITICAL] [App ] Unable to get a Window, abort.
Yesterday, I fixed the bug by installing Miniconda and running my program from the command prompt.
But today, when I run it from the command prompt, it doesn't work, and I haven't changed the code since yesterday. What am I doing wrong? Thanks, help would be appreciated.

Neo4j-admin import bad tolerance

I have a neo4j-admin import script set up with --bad-tolerance=100000 (note: I also tried --bad-tolerance 100000) as a flag. My script fails during the "collect dense nodes" step of the import with the following message: unexpected error: Too many bad entries 1001, where last one was: InputRelationship:...
I thought bad tolerance was supposed to raise that limit, so that it would only fail at the (in this case) 100,001st bad entry?
bin/neo4j-admin import is not yet on par, feature-wise, with the good old bin/neo4j-import tool (which is marked deprecated in 3.1.1).
To use --bad-tolerance you need to go back to using bin/neo4j-import.
