Added STDP AltAI example, and refactored example accordingly. #150

Open
DavidIkov wants to merge 32 commits into KasperskyLab:master from DavidIkov:examples_refactor

Conversation

@DavidIkov
Collaborator

No description provided.

Comment on lines +35 to +47
template <>
std::function<knp::core::messaging::SpikeData(knp::core::Step)>
make_training_labels_spikes_generator<knp::neuron_traits::BLIFATNeuron>(const Dataset& dataset)
{
    return [&dataset](knp::core::Step step)
    {
        knp::core::messaging::SpikeData message;

        // Emit the training label of the current image as a spike on step 11
        // of each image presentation window (steps_per_image steps long).
        knp::core::Step local_step = step % steps_per_image;
        if (local_step == 11)
            message.push_back(dataset.get_data_for_training().first[step / steps_per_image].first);
        return message;
    };
}
Contributor

Why are the labels suddenly encoded in some special way here?
Let's extract at least part of this.
We'll take it from there.
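
One possible direction for that extraction, sketched under the assumption that step 11 is simply the point within each image presentation at which the label spike is emitted; the name label_emission_step is hypothetical:

// Hypothetical named constant replacing the magic number 11 above.
constexpr knp::core::Step label_emission_step = 11;

// The generator body would then read:
knp::core::Step local_step = step % steps_per_image;
if (local_step == label_emission_step)
    message.push_back(dataset.get_data_for_training().first[step / steps_per_image].first);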

constexpr double min_potential = 0;                  // Lower bound for the neuron membrane potential.
constexpr float stability_change_parameter = 0.05F;  // Parameter controlling how the neuron stability changes.
constexpr uint32_t resource_drain_coefficient = 27;  // Coefficient controlling synaptic resource drain.
constexpr float stochastic_stimulation = 2.212F;     // Magnitude of the stochastic stimulation.
Contributor

And what is this?
Please add comments.
Far from everything here is obvious.


using Dataset = knp::framework::data_processing::classification::images::Dataset;

Dataset process_dataset(ModelDescription const& model_desc);
Contributor

It would be good to leave a comment explaining what this "processing" actually involves.
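
A sketch of the kind of comment that could go on this declaration; the wording is an assumption, since the actual behaviour of process_dataset is not visible in this diff:

/// Hypothetical documentation sketch:
/// Loads the image classification dataset referenced by @p model_desc and
/// converts it into the Dataset form used by the training and inference code.
Dataset process_dataset(ModelDescription const& model_desc);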

Comment on lines -160 to -172
std::cout << get_time_string() << ": learning started\n";

model_executor.start(
    [&dataset](size_t step)
    {
        if (step % 20 == 0) std::cout << "Step: " << step << std::endl;
        return step != dataset.get_steps_amount_for_training();
    });

std::cout << get_time_string() << ": learning finished\n";
example_network.network_ = get_network_for_inference(
    *model_executor.get_backend(), example_network.data_.inference_population_uids_,
    example_network.data_.inference_internal_projection_);
Contributor

Why was the timing removed?

Collaborator Author

I didn't see any point in it.
The first timestamp immediately gets lost in the logs, and the second one just reports when the model finished, so what does that give us?
If we keep a log like this, it should report the time spent running the model, not just a wall-clock timestamp; what would I do with a bare timestamp?
For all my measurements I used the time command, so this log was useless.

Plus it's an extra file, and the example gets bigger.
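
A minimal sketch of the elapsed-time logging suggested here, wrapped around the model_executor.start(...) call shown in the removed diff above; only std::chrono is added, everything else is taken from that snippet:

#include <chrono>  // For steady_clock.

const auto training_start = std::chrono::steady_clock::now();

model_executor.start(
    [&dataset](size_t step)
    {
        if (step % 20 == 0) std::cout << "Step: " << step << std::endl;
        return step != dataset.get_steps_amount_for_training();
    });

const auto training_duration =
    std::chrono::duration_cast<std::chrono::seconds>(std::chrono::steady_clock::now() - training_start);
std::cout << "Learning took " << training_duration.count() << " s\n";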

}

// Start model.
std::cout << get_time_string() << ": learning started\n";
Contributor

Do we really need to remove the timing?

Collaborator Author

I answered this above.
