CMU 18-879: Homework 1

Instructions

Please read the following instructions carefully before attempting the problems:

• Submission Deadline: January 30, 2025, by 11:59 PM.

• Allowed Resources: Lecture slides and class notes. Online resources are allowed only if they are directly related to the course material, and they must be properly cited and declared in your submission. Please ask the instructors if you are not sure whether a resource can be used. You are not allowed to use ChatGPT or other AI-based tools for the homework.

• Submitting Your Work: Please submit your homework solutions to Gradescope. All code written must be submitted to receive credit. Please organize your code clearly and provide instructions on how to run it, since the course staff will manually grade your submissions.

• Collaborative Work: Collaboration is allowed; however, each student must submit an individual report detailing their own work and understanding of the problems. Please declare any assistance received, including discussions with others about the homework problems or the use of any online materials, in your submission.

• Late Submissions: Each student will be granted 6 late days in total, with a maximum of 2 late days allowed per homework.


1. Extracellular Stimulation of a Single Neuron

In the first part of the homework, you will gain hands-on experience working with a simulator for extracellular neural stimulation used in [1]. The handout includes the code to generate the membrane potential of a single excitatory or inhibitory neuron under stimulation from a point electrode. For the simulation of a single neuron, you should be able to run the program on your local computer. The simulator employs realistic morphologies and dynamics of excitatory pyramidal (Pyr) and inhibitory parvalbumin (PV) neurons.

This code forms the first testbed for this course, i.e., a simulator on which you can experiment by supplying input current and examining responses.

Running the Code. Follow the README file in the simulation folder to install the required packages. To perform a simulation, call the following function as in starter.py (a complete example call is sketched after the parameter list below):

   membrane_potential, t = pointElec_sim(num_electrode, amplitude, pulse_width,
       period, total_time, cell_type, plot_neuron_with_electrodes)

• num_electrode: The number of electrodes surrounding the neuron,

• amplitude: The amplitude of the square wave (in μA),

• pulse_width: The duration of the pulse (in ms),

• period: The period of the signal (in ms),

• total_time: The total length of the signal (in ms),

• cell_type: The type of neuron being stimulated (‘Pyr’ or ‘PV’),

• plot_neuron_with_electrodes: Set to True to visualize the neuron morphology and electrode locations.

The function returns:

• membrane_potential: An ndarray of membrane potentials (num_electrode × num_time_points), in mV,

• t: The time array (in ms).
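For reference, a minimal end-to-end call might look like the sketch below. The import shown is a placeholder (use the same import that starter.py uses), and matplotlib is assumed to be available:

    # Minimal sketch of one simulation run, mirroring the call in starter.py.
    # NOTE: the import below is a placeholder -- follow starter.py for the actual module.
    from simulation import pointElec_sim
    import matplotlib.pyplot as plt

    membrane_potential, t = pointElec_sim(
        1,       # num_electrode: one point electrode
        300,     # amplitude of the square wave, in uA
        50,      # pulse_width, in ms
        100,     # period, in ms
        400,     # total_time, in ms
        'Pyr',   # cell_type: 'Pyr' or 'PV'
        True,    # plot_neuron_with_electrodes: show morphology and electrode locations
    )

    # membrane_potential has shape (num_electrode, num_time_points), in mV
    plt.plot(t, membrane_potential[0])
    plt.xlabel('Time (ms)')
    plt.ylabel('Membrane potential (mV)')
    plt.show()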

You can visualize the morphology of the neurons and the locations of the electrodes when you run the code. For this homework, the neuron IDs of the pyramidal and PV neurons are hardcoded to 6 and 36, respectively. Feel free to explore different cell morphologies from cell_id_pyr_lst and cell_id_PV_lst.


Problem 1. Time-series Analysis: Spike Detection and Firing Rate. For this problem, set num_electrode = 1, amplitude = 300, pulse_width = 50, period = 100, total_time = 400. (1) Plot the membrane potential over time. (2) Use the SciPy find_peaks function with prominence = 40 to detect spikes; the prominence measures how much a peak stands out from the surrounding baseline. Calculate the firing rate (number of spikes per second) and plot it over the stimulation duration. Feel free to choose a bin size that you find appropriate. A total of 4 plots (2 plots for each neuron) should be submitted.
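One possible workflow for this problem is sketched below. It assumes membrane_potential and t come from a pointElec_sim call with the parameters above, and the 50 ms bin width is an arbitrary choice:

    # Sketch of spike detection and binned firing-rate estimation (one cell type).
    import numpy as np
    from scipy.signal import find_peaks
    import matplotlib.pyplot as plt

    v = membrane_potential[0]                 # single-electrode trace, in mV
    peaks, _ = find_peaks(v, prominence=40)   # indices of detected spikes
    spike_times = t[peaks]                    # spike times, in ms

    # Bin spike counts and convert to spikes per second.
    bin_width_ms = 50                         # arbitrary bin size; pick what looks reasonable
    bins = np.arange(0, t[-1] + bin_width_ms, bin_width_ms)
    counts, edges = np.histogram(spike_times, bins=bins)
    firing_rate = counts / (bin_width_ms / 1000.0)

    fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
    ax1.plot(t, v)
    ax1.plot(spike_times, v[peaks], 'rx')     # mark detected spikes
    ax1.set_ylabel('Membrane potential (mV)')
    ax2.step(edges[:-1], firing_rate, where='post')
    ax2.set_xlabel('Time (ms)')
    ax2.set_ylabel('Firing rate (spikes/s)')
    plt.show()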

Problem 2. Effects of Pulse Width on Neural Response. In this task, we will modify the waveform of the injected current and analyze how the neural responses change for both the pyramidal and PV neurons. For a stimulation amplitude of 300 μA, modify the pulse width of the square-wave input (pulse_width ∈ [20, 50, 80]). (1) Plot the membrane potentials. (2) Plot the firing rates. Describe any qualitative observations. 12 plots (2 neurons × 3 pulse widths × 2 metrics) should be submitted.
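A loop over both cell types and the three pulse widths, reusing the Problem 1 analysis, might look like this sketch (again assuming pointElec_sim, find_peaks, and the plotting code above):

    # Sketch of the Problem 2 parameter sweep.
    for cell_type in ['Pyr', 'PV']:
        for pulse_width in [20, 50, 80]:
            membrane_potential, t = pointElec_sim(1, 300, pulse_width, 100, 400,
                                                  cell_type, False)
            v = membrane_potential[0]
            peaks, _ = find_peaks(v, prominence=40)
            # ...plot v vs. t and the binned firing rate exactly as in Problem 1...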

Problem 3. Effects of Stimulation Amplitude on Neural Response. The strength-duration curve describes the electrode threshold current as a function of stimulus pulse duration. For pulse_width ∈ [10, 20, 30, 40, 50, 60, 70, 80, 90, 100], calculate the corresponding minimum amplitude A_min of the square pulse that elicits a spike. You can set total_time = 100, period = 100. Feel free to use any search method (e.g., binary search) to find A_min. Plot the strength-duration curve (x-axis: pulse_width, y-axis: A_min).
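A bisection search over the amplitude is one way to find A_min. The sketch below assumes a spike counts as "elicited" when find_peaks with prominence = 40 returns at least one peak; the search bounds and tolerance are arbitrary choices:

    # Sketch of a threshold (A_min) search for one pulse width.
    def elicits_spike(amplitude, pulse_width, cell_type):
        membrane_potential, _ = pointElec_sim(1, amplitude, pulse_width, 100, 100,
                                              cell_type, False)
        peaks, _ = find_peaks(membrane_potential[0], prominence=40)
        return len(peaks) > 0

    def find_threshold(pulse_width, cell_type, lo=0.0, hi=2000.0, tol=1.0):
        # assumes `hi` is large enough to always elicit a spike
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if elicits_spike(mid, pulse_width, cell_type):
                hi = mid      # spike elicited: threshold is at or below mid
            else:
                lo = mid      # no spike: threshold is above mid
        return hi

    pulse_widths = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
    thresholds = [find_threshold(pw, 'Pyr') for pw in pulse_widths]
    # plot pulse_widths (x-axis) against thresholds (y-axis) for the strength-duration curve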

Problem 4. Effects of Direction of Stimulation. Set num_electrode = 4 to perform individual stimulations, each using one electrode at a time. The 4 electrodes are positioned at different locations on a sphere centered at the neuron. Set amplitude = 300, pulse_width = 50, period = 100, total_time = 400. Plot the membrane potential for the 4 electrodes for both neurons, and comment on the sensitivity of the neural responses to the direction of stimulation. A total of 8 plots should be submitted.
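If, as the output description above suggests, num_electrode = 4 returns one membrane-potential trace per electrode (shape 4 × num_time_points), the per-direction plots could be generated roughly as follows:

    # Sketch of the Problem 4 per-electrode plots (one subplot per electrode).
    for cell_type in ['Pyr', 'PV']:
        membrane_potential, t = pointElec_sim(4, 300, 50, 100, 400, cell_type, False)
        fig, axes = plt.subplots(4, 1, sharex=True, figsize=(6, 8))
        for i, ax in enumerate(axes):
            ax.plot(t, membrane_potential[i])
            ax.set_ylabel(f'Electrode {i} (mV)')
        axes[-1].set_xlabel('Time (ms)')
        fig.suptitle(f'{cell_type}: response to each electrode')
        plt.show()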


2. Exploring a Nonlinear Dynamical System

Problem 5. Consider the following two-dimensional dynamical system:

\[
\frac{dx}{dt} = \alpha x - y - \alpha x\,(x^2 + y^2),
\qquad
\frac{dy}{dt} = x + \alpha y - \alpha y\,(x^2 + y^2).
\]

The system exhibits an interesting phenomenon called a “limit cycle”: it has a closed trajectory along which the state (x, y) repeats itself after some fixed time interval.

a) Demonstrate that this system has a limit cycle. Hint: you might need to convert x and y into polar coordinates.
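As a reminder, the standard change of variables behind the hint (not specific to this system) is:

\[
r^2 = x^2 + y^2, \qquad \theta = \arctan\!\left(\frac{y}{x}\right),
\qquad
r\,\frac{dr}{dt} = x\,\frac{dx}{dt} + y\,\frac{dy}{dt},
\qquad
r^2\,\frac{d\theta}{dt} = x\,\frac{dy}{dt} - y\,\frac{dx}{dt}.
\]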

b) Determine whether the limit cycle is an “attractor state” (i.e., it is stable and attracts nearby trajectories) or a “repeller state” (i.e., it is unstable and drives nearby trajectories away).


References

[1] Sara Caldas-Martinez et al. “Cell-specific effects of temporal interference stimulation on cortical function”. In: Communications Biology 7.1 (2024), p. 1076.
