Package: binaryRL
Version: 0.8.9
Title: Reinforcement Learning Tools for Two-Alternative Forced Choice
        Tasks
Description: Tools for building reinforcement learning (RL) models
  specifically tailored for Two-Alternative Forced Choice (TAFC) tasks,
  commonly employed in psychological research. These models build upon
  the foundational principles of model-free reinforcement learning detailed in
  Sutton and Barto (2018) <ISBN:0262039249>. The package allows
  for the intuitive definition of RL models using simple if-else
  statements. The approach to constructing and evaluating these
  computational models follows the guidelines proposed in
  Wilson & Collins (2019) <doi:10.7554/eLife.49547>. Example
  datasets included with the package are sourced from the work of
  Mason et al. (2024) <doi:10.3758/s13423-023-02415-x>.
Authors@R: 
  c(person(
    given = "YuKi",
    role = c("aut", "cre"),
    email = "hmz1969a@gmail.com",
    comment = c(ORCID = "0009-0000-1378-1318")
  ))
Maintainer: YuKi <hmz1969a@gmail.com>
URL: https://github.com/yuki-961004/binaryRL
BugReports: https://github.com/yuki-961004/binaryRL/issues
License: GPL-3
Encoding: UTF-8
LazyData: TRUE
RoxygenNote: 7.3.2
Depends: R (>= 4.0.0)
Imports: future, doFuture, foreach, doRNG, progressr
Suggests: stats, GenSA, GA, DEoptim, pso, mlrMBO, mlr, ParamHelpers,
        smoof, lhs, DiceKriging, rgenoud, cmaes, nloptr
NeedsCompilation: no
Packaged: 2025-06-13 10:06:04 UTC; hmz19
Author: YuKi [aut, cre] (ORCID: <https://orcid.org/0009-0000-1378-1318>)
Repository: CRAN
Date/Publication: 2025-06-15 09:10:02 UTC
